Question: The development of unions was a significant turning point for workers in the United States, yet union membership has been declining over the past few decades. Are unions no longer needed in the United States? Do you think the development of such a broad system of employment laws has decreased the need for unions? Why or why not?