Returns and the Bell Curve

An investment has an expected return of 16 percent per year with a standard deviation of 8 percent. Assuming that the returns on this investment are at least roughly normally distributed, how frequently do you expect to lose money? Express your answer as a fraction; for example, 11 out of 100 years = 11/100.
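One way to check an answer, assuming the returns really are normal with mean 16% and standard deviation 8%: losing money means a return below 0%, which is (0 − 16)/8 = −2 standard deviations below the mean. A short sketch using only the standard library (the function name `normal_cdf` is ours, not from any particular text):

```python
import math

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# A loss is a return below 0%, i.e. 2 standard deviations below the mean.
p_loss = normal_cdf(0.0, mu=16.0, sigma=8.0)
print(f"P(return < 0) = {p_loss:.4f}")  # prints P(return < 0) = 0.0228
```

The exact normal tail below −2 sigma is about 2.28%, roughly 1 year in 44; the usual rule-of-thumb (about 95% of outcomes within 2 sigma, so about 2.5% in each tail) gives the textbook-style fraction of roughly 1 out of 40 years.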