If a computer statistical package were to simulate 2000 random observations from a normal distribution with μ = 150 and σ = 25, what percentage of these observations would you expect to have a value of 140 or less?
Do you think the actual number in the "≤ 140" range would equal the expected number in this range? If so, why? If not, why not?
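
As a check on your answer, the simulation described in the exercise can be carried out directly. The following is a minimal sketch in Python (the exercise does not name a statistical package; NumPy and SciPy are assumed here, and the seed is an arbitrary choice for reproducibility):

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(seed=1)              # fixed seed so the run is reproducible
    sample = rng.normal(loc=150, scale=25, size=2000)  # 2000 draws from N(150, 25)

    observed = np.mean(sample <= 140)                # fraction of draws at or below 140
    expected = norm.cdf(140, loc=150, scale=25)      # P(X <= 140) = Phi((140 - 150)/25)

    print(f"observed proportion: {observed:.4f}")
    print(f"expected proportion: {expected:.4f}")

Here the expected proportion is Φ(−0.4) ≈ 0.3446, i.e., about 689 of the 2000 observations, while the observed count will fluctuate around that value from run to run because of sampling variability.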