Problem
Suppose a program executes 1000 machine instructions from start to finish and takes 10 microseconds to run when no page faults occur. How long will the same program take if 1 in every 100 instructions causes a page fault, and each page fault takes 100 milliseconds to resolve?
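One way to sanity-check the arithmetic is to plug the given numbers into a short calculation. The sketch below (plain Python, with variable names chosen only for illustration) assumes the page-fault service time simply adds to the 10-microsecond fault-free runtime:

```python
# Quantities taken directly from the problem statement.
instructions = 1000            # total machine instructions executed
base_time_us = 10              # runtime with no page faults, in microseconds
fault_rate = 1 / 100           # one page fault per 100 instructions
fault_cost_us = 100 * 1000     # 100 ms per page fault, expressed in microseconds

# Number of page faults and the total elapsed time (assuming fault
# service time is purely additive on top of the fault-free runtime).
num_faults = instructions * fault_rate
total_us = base_time_us + num_faults * fault_cost_us

print(f"page faults: {num_faults:.0f}")
print(f"total time:  {total_us:.0f} us (~{total_us / 1e6:.5f} s)")
```

Under these assumptions the program incurs 10 page faults, so the runtime grows from 10 microseconds to 10 + 10 x 100,000 = 1,000,010 microseconds, roughly one second: the page faults completely dominate the execution time.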