The normal distribution is commonly used to model the variability expected when making measurements. In this context, a measured quantity x is modeled as normally distributed, with a mean equal to the "true" value of the object being measured. The precision of the measuring instrument determines the standard deviation of the distribution.
If the measurements of the length of an object have a normal probability distribution with a standard deviation of 1 mm, what is the probability that a single measurement will lie within 2 mm of the true length of the object?
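One way to check this is to note that "within 2 mm" with a standard deviation of 1 mm means within two standard deviations of the mean. A short sketch (the function name `prob_within` is illustrative, not from the text) computes this probability from the normal CDF via the error function:

```python
import math

def prob_within(tolerance_mm: float, sigma_mm: float) -> float:
    """Probability that a normal measurement error lies within +/- tolerance_mm,
    using P(|X - mu| <= t) = erf(t / (sigma * sqrt(2)))."""
    return math.erf(tolerance_mm / (sigma_mm * math.sqrt(2.0)))

p = prob_within(2.0, 1.0)
print(f"P(within 2 mm) = {p:.4f}")  # about 0.9545
```

This agrees with the familiar rule of thumb that roughly 95% of normally distributed values fall within two standard deviations of the mean.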