Problem
Nursing has traditionally been a female-dominated occupation; however, increasing numbers of men are now entering the field. Do you think nursing carries a gender stigma? Why or why not? If so, what do you think it will take to overcome that stigma?