What impact did WWI and WWII have upon American government, economy, and society?
How did attitudes toward what the government should or should not do change as a result of the wartime experience?
How were American society and the economy (including race relations) transformed by the wartime experience?
(Remember that the experiences of WWI and WWII may not be the same in every case, so be sure to note which war had the impact you are describing.)