There are two algorithms, Alg1 and Alg2, for a problem of size n. Alg1 runs in n^2 microseconds and Alg2 runs in 100 n log n microseconds. Alg1 can be implemented using 4 hours of programming time and needs 2 minutes of CPU time. If programmers are paid 20 dollars per hour and CPU time costs 50 dollars per minute, how many times must a problem of size 500 be solved using Alg2 in order to justify its development cost?
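
A minimal sketch of the break-even arithmetic, under two assumptions the statement leaves open: that "log" is base 10 (with base 2, Alg2 would actually be slower than Alg1 at n = 500), and that the 4 hours of programming time plus 2 minutes of CPU time are treated as the one-time development cost to be recouped by the per-run CPU savings of Alg2 over Alg1. All variable and function names below are illustrative, not from the original.

    import math

    N = 500
    USD_PER_PROG_HOUR = 20
    USD_PER_CPU_MINUTE = 50
    USD_PER_MICROSECOND = USD_PER_CPU_MINUTE / 60e6  # 60,000,000 microseconds per minute

    def alg1_us(n: int) -> float:
        """Running time of Alg1 in microseconds: n^2."""
        return n ** 2

    def alg2_us(n: int) -> float:
        """Running time of Alg2 in microseconds: 100 n log n (base 10 assumed)."""
        return 100 * n * math.log10(n)

    # CPU cost of one run of each algorithm on a problem of size N.
    cost1 = alg1_us(N) * USD_PER_MICROSECOND
    cost2 = alg2_us(N) * USD_PER_MICROSECOND
    savings_per_run = cost1 - cost2  # what one Alg2 run saves relative to Alg1

    # One-time development cost: 4 hours of programming plus 2 minutes of CPU time.
    development_cost = 4 * USD_PER_PROG_HOUR + 2 * USD_PER_CPU_MINUTE

    runs_to_break_even = math.ceil(development_cost / savings_per_run)
    print(f"cost per run: Alg1 ${cost1:.4f}, Alg2 ${cost2:.4f}")
    print(f"savings per run: ${savings_per_run:.4f}")
    print(f"runs needed to recover ${development_cost}: {runs_to_break_even}")

If the intended log base is 2 instead, savings_per_run comes out negative at n = 500 and no number of runs recovers the cost, so the choice of base matters for the answer.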