Question: Suppose we can observe and measure with perfect accuracy the quarterly total returns obtained by both a manager and her appropriately defined benchmark. Suppose further that these returns are independent across time in both cases (that is, they lack serial correlation, like stock market returns). Suppose both the manager and the benchmark have quarterly return volatility (standard deviation across time) equal to 3%. And suppose the correlation between the manager and benchmark returns is +75%. By how much must the manager's five-year average quarterly return exceed that of her benchmark in order to be able to conclude with usual statistical significance that the manager's performance differential was not just a random outcome?
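The calculation the question calls for can be sketched as follows. The volatility of the quarterly return differential (tracking error) follows from the two volatilities and their correlation; dividing by the square root of the number of quarters gives the standard error of the average differential. The choice of a two-sided 5% critical value (z = 1.96) for "usual statistical significance" is an assumption here, as is the use of five years of quarterly data (n = 20):

```python
import math

sigma = 0.03   # quarterly volatility of both manager and benchmark
rho = 0.75     # correlation between manager and benchmark returns
n = 20         # five years of quarterly observations
z = 1.96       # two-sided 5% critical value (assumed convention)

# Tracking error: volatility of the quarterly return differential.
# Var(M - B) = sigma^2 + sigma^2 - 2*rho*sigma*sigma = 2*sigma^2*(1 - rho)
sigma_diff = sigma * math.sqrt(2 * (1 - rho))

# Standard error of the mean differential over n independent quarters
se_mean = sigma_diff / math.sqrt(n)

# Required average quarterly outperformance for significance
required = z * se_mean

print(f"tracking error (quarterly): {sigma_diff:.4%}")
print(f"required mean quarterly outperformance: {required:.4%}")
```

With these inputs the tracking error is about 2.12% per quarter, so the manager's average quarterly return would need to exceed the benchmark's by roughly 0.93% per quarter.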