1. An image-encoding algorithm, when used to encode images of a certain size, takes a mean of 110 milliseconds with a standard deviation of 15 milliseconds. What is the probability that the mean time (in milliseconds) for encoding 50 randomly selected images of this size will be between 90 milliseconds and 135 milliseconds? What assumptions do we need to make?
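   A minimal sketch (not part of the original exercise) of how this probability could be checked numerically, assuming the sample mean of 50 encoding times is approximately normal by the Central Limit Theorem and that SciPy is available:

   ```python
   from math import sqrt
   from scipy.stats import norm

   mu, sigma, n = 110, 15, 50            # population mean, sd (ms), and sample size from the problem
   se = sigma / sqrt(n)                  # standard error of the sample mean under the CLT

   # P(90 < X-bar < 135) when X-bar ~ Normal(mu, se)
   p = norm.cdf(135, loc=mu, scale=se) - norm.cdf(90, loc=mu, scale=se)
   print(p)                              # essentially 1.0: both bounds lie many standard errors from 110
   ```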
2. In order to evaluate a new release of a database management system, a database administrator runs a benchmark program several times and measures the time to completion in seconds. Assuming that the distribution of times is normal with mean 95 seconds and standard deviation 10 seconds, what proportion of measurement times will fall below 85 seconds?
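   As a quick numerical check (a sketch, again assuming SciPy), the proportion below 85 seconds is the standard normal CDF evaluated at the z-score (85 − 95)/10 = −1:

   ```python
   from scipy.stats import norm

   mu, sigma = 95, 10                    # stated mean and standard deviation in seconds
   p = norm.cdf(85, loc=mu, scale=sigma) # P(X < 85) = Phi((85 - 95) / 10) = Phi(-1)
   print(p)                              # about 0.1587
   ```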