A student ran numerical simulations on a dual-core computer (3.2 GHz, 2MB Smart Cache, 32GB RAM).
Each simulation involves computation over 2GB of data and takes about 3 hours.
To speed up the work, he ran two simulations simultaneously. However, the computer took 8 hours to complete both jobs, i.e., an average of 4 hours per job.
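For reference, a quick comparison of the two schedules (the notation $T_{\text{seq}}$ and $T_{\text{conc}}$ is introduced here; the 6-hour sequential figure follows from the stated 3-hour single-job time):

```latex
T_{\text{seq}}  = 2 \times 3\ \text{h} = 6\ \text{h}  % two jobs run back to back
T_{\text{conc}} = 8\ \text{h}                          % two jobs run together
\frac{T_{\text{conc}}}{T_{\text{seq}}} = \frac{8\ \text{h}}{6\ \text{h}} \approx 1.33
```

So running the jobs concurrently is about 33% slower overall than running them one after the other, even though each job had a whole core to itself.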
Explain why the performance is poorer than running only one simulation at a time.
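The mechanism hinted at by the hardware specs (2MB of cache shared between the two cores, one memory bus) can be illustrated with a micro-benchmark. The sketch below is hypothetical and is not the student's simulation code: it streams through a buffer far larger than the cache, first in one process and then in two concurrent copies, so the per-copy times can be compared. It assumes a POSIX system (fork/wait) and the 64-byte cache-line size typical of x86.

```c
/* Hypothetical cache-contention micro-benchmark (not the student's code). */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#include <unistd.h>
#include <sys/wait.h>

#define BUF_BYTES (256UL * 1024 * 1024)  /* 256MB: far larger than a 2MB cache */
#define PASSES 8

/* Stream through the buffer, touching every cache line, and return seconds. */
static double stream_once(void)
{
    unsigned char *buf = malloc(BUF_BYTES);
    if (!buf) { perror("malloc"); exit(1); }
    memset(buf, 1, BUF_BYTES);           /* fault the pages in before timing */

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    volatile unsigned long sum = 0;      /* volatile: keep the loop under -O2 */
    for (int p = 0; p < PASSES; p++)
        for (size_t i = 0; i < BUF_BYTES; i += 64)  /* one read per cache line */
            sum += buf[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    free(buf);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

/* Run n copies as child processes and wait for all of them to finish. */
static void run_copies(int n, const char *label)
{
    for (int i = 0; i < n; i++) {
        if (fork() == 0) {
            printf("%s: copy %d took %.2f s\n", label, i, stream_once());
            fflush(stdout);
            _exit(0);
        }
    }
    while (wait(NULL) > 0)
        ;
}

int main(void)
{
    run_copies(1, "alone");      /* one memory-bound job: full cache and bus */
    run_copies(2, "contended");  /* two jobs: shared cache thrashes, bus is split */
    return 0;
}
```

Compiled with something like `gcc -O2 bench.c`, the two contended copies would each be expected to run noticeably slower than the lone run on a dual-core machine with a shared cache: each process keeps evicting the other's cache lines, and the two split the memory bandwidth. That is the same pattern as the 3 h versus 4 h per-job times in the problem, where each job's 2GB working set dwarfs the 2MB shared cache.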