I'm having a moment trying to answer this problem. Use the Central Limit Theorem to find the mean and standard error of the distribution of sample means.
The amounts of time employees of a telecommunications company have worked for the company are normally distributed with a mean of 5.9 years and a standard deviation of 2.2 years. Random samples of size 15 are drawn from the population, and the mean of each sample is determined.
Help please
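In case a worked sketch helps: by the Central Limit Theorem, the sampling distribution of the sample mean has a mean equal to the population mean and a standard error equal to the population standard deviation divided by the square root of the sample size. Plugging in the values given in the problem (mean 5.9, standard deviation 2.2, sample size 15), that works out to roughly:

$$
\mu_{\bar{x}} = \mu = 5.9 \text{ years},
\qquad
\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}} = \frac{2.2}{\sqrt{15}} \approx 0.57 \text{ years}.
$$

Since the population itself is normally distributed, the sample mean is exactly normal for n = 15, so these two numbers fully describe the sampling distribution.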