Is it the United States government's obligation to provide for the health, safety, and welfare of its citizens? Should the government protect us from unsafe products and environmental conditions? Recently, the government has also expanded citizens' access to health insurance to help manage their health care. Is this a good role for government?