Orange County managed an investment pool into which several municipalities made short-term investments. A total of $7.5 billion was invested in the pool, and this money was used to purchase securities. Using these securities as collateral, the pool borrowed $12.5 billion from Wall Street brokerages, and those funds were used to purchase additional securities. The $20 billion total was invested primarily in long-term fixed-income securities, which yielded more than short-term alternatives. Furthermore, as interest rates slowly declined, as they did in 1992-1994, an even greater return was obtained. Things fell apart in 1994, when interest rates rose sharply.
Hypothetically, assume that initially the duration of the invested portfolio was 10 years, the short-term rate was 6%, the average coupon rate on the portfolio was 8.5% of face value, the cost of Wall Street money was 7%, and short-term interest rates were falling at 0.5% per year.
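One rough way to see the mechanics is to track the leveraged position's annual carry: coupon income on the full $20 billion less interest paid on the $12.5 billion borrowed, measured against the $7.5 billion of investor money. The sketch below sets this up; the variable names, and the assumption that the 7% borrowing cost applies to the full $12.5 billion, are illustrative rather than taken from the original account.

```python
# Leveraged position of the hypothetical pool (all figures in $ billions).
equity = 7.5                       # municipalities' short-term investments
borrowed = 12.5                    # loans from Wall Street brokerages
portfolio = equity + borrowed      # 20.0, invested in long-term bonds

coupon_rate = 0.085                # average coupon on the portfolio (of face value)
borrow_rate = 0.07                 # cost of Wall Street money

# Annual net carry: income on $20B minus interest on $12.5B,
# accruing to the $7.5B of investor equity.
income = coupon_rate * portfolio   # 1.700
interest = borrow_rate * borrowed  # 0.875
carry = income - interest          # 0.825
print(f"carry return on equity: {carry / equity:.1%}")  # 11.0% per year
```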
What rate of return did pool investors obtain during this early period? Does it compare favorably with the 6% that these investors would have obtained by investing normally in short-term securities?
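A hedged way to estimate this, assuming the 10-year duration applies to the 0.5%-per-year rate decline as a parallel shift in yields: the portfolio's price rises by roughly duration times the yield change each year, and that gain accrues to the equity on top of the 11% carry. The figures below follow that assumption and are a sketch, not a definitive accounting.

```python
# Early-period return on equity: carry plus duration-based price gain
# (all figures in $ billions; parallel yield shift assumed).
equity, borrowed = 7.5, 12.5
portfolio = equity + borrowed                    # 20.0
carry = 0.085 * portfolio - 0.07 * borrowed      # 1.700 - 0.875 = 0.825 per year

duration = 10.0                                  # years
rate_drift = -0.005                              # yields falling 0.5% per year

# Price change of a bond portfolio is approximately -duration * (yield change).
price_gain = -duration * rate_drift * portfolio  # 0.05 * 20 = 1.0 per year
total_return = (carry + price_gain) / equity     # 1.825 / 7.5
print(f"early-period return on equity: {total_return:.1%}")  # about 24.3%
```

Under these assumptions the pool earns roughly 24% per year, about four times the 6% short-term alternative, which is how leverage and falling rates flattered the early results.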
When interest rates had fallen two percentage points and then began rising at 2% per year, what rate of return did the pool obtain?
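Continuing the same sketch: after a two-point fall, suppose the borrowing cost has dropped in step to 5% (an assumption; the coupon income on the existing bonds remains fixed at 8.5%), while yields now rise 2% per year. The carry improves, but the duration loss dominates.

```python
# Late-period return: rates have fallen 2 points and now rise 2% per year
# (all figures in $ billions; borrowing cost assumed to have fallen to 5%).
equity, borrowed = 7.5, 12.5
portfolio = equity + borrowed                  # 20.0
carry = 0.085 * portfolio - 0.05 * borrowed    # 1.700 - 0.625 = 1.075 per year

duration = 10.0
rate_rise = 0.02                               # yields increasing 2% per year

price_loss = -duration * rate_rise * portfolio # -0.20 * 20 = -4.0 per year
late_return = (carry + price_loss) / equity    # -2.925 / 7.5
print(f"late-period return on equity: {late_return:.1%}")  # about -39%
```

Under these assumptions the pool loses roughly 39% of investor equity per year, which illustrates why the position unraveled when rates rose in 1994.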