To estimate the time required to perform a given laboratory procedure, suppose that we measured the time taken on 60 occasions when the service was provided.
Based on this sample, we obtained a mean of 20.32 minutes and a standard deviation of 3.82 minutes.
What can we say with a probability of 0.95 about the size of the error when we use 20.32 minutes as an estimate of the true average time required to provide the procedure?
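Since the sample size n = 60 is large, the sample standard deviation can stand in for σ, and the question calls for the maximum error of estimate E = z₀.₀₂₅ · s/√n with z₀.₀₂₅ = 1.96. A minimal sketch of that computation, assuming this standard large-sample result:

```python
import math

n = 60          # number of occasions the procedure was timed
s = 3.82        # sample standard deviation (minutes)
z = 1.96        # z-value for 0.95 probability (two-sided)

# Maximum error of estimate: E = z * s / sqrt(n)
E = z * s / math.sqrt(n)
print(round(E, 2))  # -> 0.97
```

Thus we can assert with probability 0.95 that the error in using 20.32 minutes as the estimate of the true average time is less than about 0.97 minutes.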