Right and Wrong?
How honest should managers be with employees about a company's worsening financial condition? When the owner of a legal services firm mentioned to his employees that the business was not doing well, it ended up frightening them. "People started crying. One person gave notice and left for a job at another company."
1. What do you think the owner should have done?
2. What would be achieved by telling them?
3. Is not telling them unethical? Why or why not?