Video cards based on Nvidia's GeForce2 processor typically cost $250. Nvidia released a light version of the chip that cost $150. If a certain game maker was purchasing 3,000 cards per quarter, what was the present worth of the savings associated with the cheaper chip over a two-year period at an interest rate of 16% per year, compounded quarterly?
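
A minimal Python sketch of the computation this problem calls for, under the usual reading of the givens: the per-card saving is the $100 price difference, so the uniform quarterly savings series is A = $100 × 3,000 = $300,000; the quarterly rate is i = 16%/4 = 4%; and two years gives n = 8 quarters. The variable names and the printed factor label are illustrative, not from the problem text.

```python
# Present worth of a uniform series:
#   P = A * [(1 + i)^n - 1] / [i * (1 + i)^n]

savings_per_card = 250 - 150          # $100 saved per card
cards_per_quarter = 3000
A = savings_per_card * cards_per_quarter  # $300,000 quarterly savings
i = 0.16 / 4                          # 4% per quarter (16%/yr, compounded quarterly)
n = 2 * 4                             # 8 quarters in two years

p_a_factor = ((1 + i) ** n - 1) / (i * (1 + i) ** n)
P = A * p_a_factor

print(f"(P/A, 4%, 8) = {p_a_factor:.4f}")   # ~ 6.7327
print(f"Present worth of savings = ${P:,.0f}")  # ~ $2,019,823
```

Under these assumptions the present worth works out to roughly $2.02 million, matching the tabulated factor (P/A, 4%, 8) = 6.7327 applied to the $300,000 quarterly savings.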