African Americans had long protested inequality in America, but it was not until after World War II that the nation made significant progress on civil rights. Why was the United States finally willing to embrace an idea that had, after all, been asserted in the Declaration of Independence -- that all are created equal? What had changed in the postwar era, and why?