Should employers continue to carry the burden of providing health care benefits to employees, or should the government institute a form of national health insurance instead? What difference might this make to the ability of U.S. companies, such as automobile manufacturers, to compete internationally, given that most other developed countries provide national health insurance?