Discuss below in about 100 words:
Do you think all employers, regardless of size, should be required to provide employees with mandated benefits such as Social Security deductions, disability benefits, unemployment insurance, Medicare deductions, and workers' compensation, among others? Starting next year, certain employers will be required to provide health insurance. Do you agree with this? What benefits, if any, do you think should be legally mandated? Support your opinion with facts.