Discussion Post
In this discussion, we're thinking more about that: what is it about nursing that makes it seem to be more of a "woman's job" than a "man's job"? Why are there more male doctors than female doctors, and why do female doctors end up in general medicine or pediatrics more often than men do? Are those gender norms changing?