Suppose a time-sharing operating system allots time slices of 50 milliseconds. If it normally takes 8 milliseconds to position a disk's read/write head over the desired track and another 17 milliseconds for the desired data to rotate around to the read/write head, how much of a program's time slice can be spent waiting for a read operation from a disk to take place? If the machine is capable of executing ten instructions each microsecond, how many instructions can be executed during this waiting period? (This is why, when a process requests an operation from a peripheral device, a time-sharing system terminates that process's time slice and allows another process to run while the first process waits for the device.)
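As a sanity check on the arithmetic, the short Python sketch below uses only the figures given in the exercise (50 ms slice, 8 ms seek, 17 ms rotational delay, 10 instructions per microsecond); the variable names are illustrative, not part of any particular system.

```python
# Figures taken directly from the exercise statement.
TIME_SLICE_MS = 50           # length of one time slice
SEEK_MS = 8                  # time to position the read/write head over the track
ROTATIONAL_DELAY_MS = 17     # time for the data to rotate to the head
INSTRUCTIONS_PER_US = 10     # execution rate: instructions per microsecond

wait_ms = SEEK_MS + ROTATIONAL_DELAY_MS              # total wait for the read: 25 ms
fraction_of_slice = wait_ms / TIME_SLICE_MS          # 25 / 50 = 0.5, half the slice
instructions_idle = wait_ms * 1000 * INSTRUCTIONS_PER_US  # 25,000 microseconds * 10

print(f"Wait for read: {wait_ms} ms ({fraction_of_slice:.0%} of the time slice)")
print(f"Instructions that could have executed while waiting: {instructions_idle:,}")
```

Running this shows the process would spend half of its time slice waiting and forgo roughly 250,000 instructions, which illustrates why the system is better off suspending the process and scheduling another one during the disk operation.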