A store averages 40.7 customers per day with a standard deviation of 12.9 customers. If a random sample of 100 days is taken, what is the probability that the mean number of customers (the sample mean) will exceed 43 customers?

Suppose the time per visit that each customer spends in the store is known to have a normal probability distribution. If a random sample of 10 customers has a sample mean of 6.03 minutes and a sample standard deviation of 0.125 minutes, what is the probability that a randomly selected customer will spend more than 6.03 minutes in the store?
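One way to check the arithmetic is a short Python sketch using SciPy. It assumes the usual setup for problems like these: the first part applies the central limit theorem, so the sample mean is approximately normal with standard error sigma / sqrt(n); the second part treats the sample mean and sample standard deviation as the parameters of the normal distribution of individual visit times.

```python
from scipy.stats import norm

# Part 1: sampling distribution of the mean (central limit theorem).
# With n = 100 days, the sample mean is approximately normal with
# mean 40.7 and standard error sigma / sqrt(n) = 12.9 / 10 = 1.29.
mu, sigma, n = 40.7, 12.9, 100
se = sigma / n ** 0.5
z1 = (43 - mu) / se                     # z is about 1.78
p_mean_over_43 = 1 - norm.cdf(z1)       # P(sample mean > 43), about 0.037
print(f"P(sample mean > 43) = {p_mean_over_43:.4f}")

# Part 2: one customer's visit time, treating the sample mean 6.03 and
# sample standard deviation 0.125 as the normal distribution's parameters
# (an assumption this sketch makes, as the problem gives only sample values).
xbar, s = 6.03, 0.125
z2 = (6.03 - xbar) / s                  # z = 0, since 6.03 equals the mean
p_over_603 = 1 - norm.cdf(z2)           # P(X > 6.03) = 0.5
print(f"P(one customer > 6.03 minutes) = {p_over_603:.2f}")
```

Under those assumptions the first probability comes out near 0.037, and the second is exactly 0.5, since 6.03 minutes sits at the center of the assumed normal distribution.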