The following is a question from my study guide for my final exam:
Many people have argued that the wars of the twentieth century transformed American society,
providing more opportunities for women and minorities in American life, changing the shape of our cities,
and bringing prosperity. In a well-reasoned essay, explain why you think that war did or did not change America.
If you do not believe war changed America, what did?