World War II completed America's rise to dominant world power status. Did American dominance make the world a better, safer place or a worse, less stable one? Explain.
OR
After the Civil War, how well did the United States live up to the rhetoric of the Declaration of Independence ("all men are created equal"), both at home and abroad? Explain.