Movies hold a distinctly representative place in American society. I feel that movies have become part of the American experience and help us shape our own identity. Depending on when we grew up, we all have "those movies" that defined our era. Films today have a knack for becoming ingrained in our pop culture, and they contribute to the distinctly "American" way of life. Would you agree with this notion? If so, how do you feel movies shaped your youth?