What do you think is meant by the statements above? How did the New Deal change America?
Do you think the New Deal changed the relationship between citizens and the federal government? If so, how? For example, should the government help regulate the economy?