Investment A has an expected return of $25 million with a standard deviation of $10 million. Investment B has an expected return of $5 million with a standard deviation of $30 million. If you assume returns follow a normal distribution, which investment gives a better chance of achieving a return of at least $40 million? Explain how your answer could change if you knew returns followed a skewed distribution rather than a normal distribution.
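As a starting point, the normal-model comparison comes down to the tail probability P(X >= 40) = 1 - Phi((40 - mu)/sigma) for each investment. A minimal sketch of that check, assuming Python with SciPy available (library choice and variable names are illustrative, not part of the original question):

```python
from scipy.stats import norm

# P(X >= 40) under a normal model: the survival function 1 - Phi((40 - mu) / sigma)
p_a = norm.sf(40, loc=25, scale=10)   # Investment A: z = (40 - 25) / 10 = 1.5
p_b = norm.sf(40, loc=5, scale=30)    # Investment B: z = (40 - 5) / 30 ≈ 1.17

print(f"P(A >= 40M) ≈ {p_a:.3f}")  # ≈ 0.067
print(f"P(B >= 40M) ≈ {p_b:.3f}")  # ≈ 0.122
```

The investment with the smaller z-score for the $40 million threshold has the larger upper-tail probability under the normal assumption; note that this tail comparison relies on the symmetric normal shape, which is exactly what the skewed-distribution part of the question asks you to reconsider.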