1. Show that the greedy algorithm to minimize the mean completion time for multiprocessor job scheduling works. (A sketch of the key exchange step follows the exercise list.)
2. The input is a set of jobs j1, j2, ..., jN, each of which takes one time unit to complete. Each job ji earns di dollars if it is completed by the time limit ti, but no money if it is completed after the time limit. The problem is to maximize the total amount of money earned.
a. Give an O(N²) greedy algorithm to solve the problem. (A Python sketch follows the exercise list.)
b. Modify your algorithm to obtain an O(N log N) time bound. (Hint: The time bound is due entirely to sorting the jobs by money. The rest of the algorithm can be implemented, using the disjoint set data structure, in o(N log N).) (A sketch follows the exercise list.)
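For Exercise 1, one standard route is an exchange argument; only the key step is sketched below, and it assumes the setup from the chapter text, which the exercise does not restate: job ji has running time ti, and in a fixed schedule its multiplier mi counts the jobs on its processor, ji included, whose completion times include ti, so the total completion time is the sum, over all jobs, of mi times ti.

\[
(m_a t_b + m_b t_a) - (m_a t_a + m_b t_b) = (m_a - m_b)(t_b - t_a) < 0
\quad \text{whenever } t_a > t_b \text{ and } m_a > m_b .
\]

Swapping any such pair strictly decreases the total, so an optimal schedule pairs the largest multipliers with the shortest jobs; a complete answer also has to check that the greedy's balanced assignment of jobs to processors yields the best possible multipliers, which is what scheduling the sorted jobs in round-robin order across the processors does.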
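For Exercise 2(a), a minimal sketch of the usual greedy follows, written in Python for brevity; the function name, the (deadline, profit) pair representation, and the assumption of positive integer deadlines are illustrative choices, not part of the exercise. Take the jobs in order of decreasing di and place each one in the latest still-free unit slot no later than its ti; scanning backward for that slot costs O(N) per job, hence O(N²) in total.

def schedule_jobs_quadratic(jobs):
    """Greedy job sequencing.  jobs is a list of (deadline, profit) pairs;
    every job takes one time unit and deadlines are positive integers.
    Returns the total money earned."""
    n = len(jobs)
    ordered = sorted(jobs, key=lambda job: job[1], reverse=True)  # richest jobs first
    taken = [False] * (n + 1)      # taken[t]: is the unit slot ending at time t used?
    total = 0
    for deadline, profit in ordered:
        t = min(deadline, n)       # no point looking past slot n
        while t > 0 and taken[t]:  # scan backward for the latest free slot
            t -= 1
        if t > 0:                  # a slot no later than the deadline is free
            taken[t] = True
            total += profit
    return total

For example, schedule_jobs_quadratic([(2, 60), (1, 100), (2, 20), (1, 40)]) returns 160: the 100-dollar job runs in the first slot and the 60-dollar job in the second.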
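For Exercise 2(b), only the slot search changes: a disjoint-set forest answers each "latest free slot no later than ti" query in nearly constant amortized time, so the sort dominates and the rest runs in o(N log N), matching the hint. This is again a non-authoritative sketch; keeping a separate latest array (the latest free slot in each set, indexed by the set's root) is one way to retain union by size and is an implementation choice, not something the exercise prescribes.

def schedule_jobs_fast(jobs):
    """Same greedy as above, but the slot search uses a disjoint-set forest,
    so everything after the O(N log N) sort runs in almost linear time."""
    n = len(jobs)
    ordered = sorted(jobs, key=lambda job: job[1], reverse=True)

    parent = list(range(n + 1))    # forest over slots 0..n (0 means "no slot left")
    size = [1] * (n + 1)           # union by size
    latest = list(range(n + 1))    # latest free slot in the set rooted here

    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]   # path halving
            t = parent[t]
        return t

    total = 0
    for deadline, profit in ordered:
        root = find(min(deadline, n))
        slot = latest[root]        # latest free slot no later than the deadline
        if slot > 0:
            total += profit
            # Slot is now used: merge its set with the set containing slot - 1,
            # so later queries in this range fall through to earlier slots.
            other = find(slot - 1)
            merged_latest = latest[other]
            if size[root] < size[other]:
                root, other = other, root
            parent[other] = root
            size[root] += size[other]
            latest[root] = merged_latest
    return total

The function returns the same totals as the quadratic version; only the cost of finding a free slot changes.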