Problem:
Should the government be more involved in regulating U.S. firms and their international operations, or should the watchful eyes be within the business rather than outside it? I truly believe that the majority of American businesses want to make money while doing the right thing, and that they are led by honest and ethical people. Does the media (and perhaps the current administration) tend to portray business as "evil," creating the view that businesses must be regulated and controlled by government?