The theory of medical dominance suggests that doctors hold a key authoritative role in the health care professions. Given that this way of thinking has existed since the early part of the twentieth century, do you believe it remains true? Discuss why you agree or disagree with this line of thinking.