The following data are the CPU times required by an algorithm to solve a set of problems:
2.0 1.4 3.5 2.3 3.2 3.6 0.1 3.5 2.2 2.1 2.4 1.5 2.2 2.3 2.7 1.9 1.7 1.8 3.1 1.5 1.5 2.6 2.8 2.5 2.5 3.9 0.8 1.8 3.3 3.7
a. Determine the sample mean and sample standard deviation.
b. Compute a 99% confidence interval on the mean time required to solve a problem (a worked sketch follows the problem statement).
c. Another algorithm requires an average of 6.6 units of CPU time per problem, and the solutions obtained are equivalent. Does the new algorithm appear to be more efficient than the other with respect to computing time? Discuss.
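For part b, the population standard deviation is unknown, so the standard two-sided t interval is the natural choice (a sketch of the textbook formula, with the critical value read from a t table):

\[
\bar{x} \pm t_{\alpha/2,\,n-1}\,\frac{s}{\sqrt{n}},
\qquad \alpha = 0.01,\quad n = 30,\quad t_{0.005,\,29} \approx 2.756
\]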
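A minimal Python sketch of parts a through c (the choice of Python is an assumption, as the exercise names no language; the t critical value is hard-coded from a standard table):

```python
import math
from statistics import mean, stdev

# CPU times from the problem statement
times = [2.0, 1.4, 3.5, 2.3, 3.2, 3.6, 0.1, 3.5, 2.2, 2.1,
         2.4, 1.5, 2.2, 2.3, 2.7, 1.9, 1.7, 1.8, 3.1, 1.5,
         1.5, 2.6, 2.8, 2.5, 2.5, 3.9, 0.8, 1.8, 3.3, 3.7]

n = len(times)                  # n = 30
xbar = mean(times)              # part a: sample mean (about 2.35)
s = stdev(times)                # part a: sample standard deviation (about 0.89)

# Part b: 99% two-sided t interval, xbar +/- t * s / sqrt(n)
t_crit = 2.756                  # t_{0.005, 29} from a t table
half_width = t_crit * s / math.sqrt(n)
lo, hi = xbar - half_width, xbar + half_width

print(f"mean = {xbar:.3f}, s = {s:.3f}")
print(f"99% CI for the mean: ({lo:.3f}, {hi:.3f})")

# Part c: the other algorithm's average of 6.6 lies far above the
# upper confidence limit, so this algorithm appears faster on average.
```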