The media, and Hollywood in particular, represent one avenue through which the general public becomes familiar with the role of nurses. How does the media positively or negatively influence the public's image of nursing?
What other avenues might better educate the general public about the role and scope of nursing, as well as the changing health care system?