Suppose a network transmits 1024-byte packets, each containing a 128-byte header and a 4-byte checksum. If each station on the network (including the server) is guaranteed the opportunity to transmit at least one packet every x time units,
(1) what is the maximum time, as a function of x, required (based on these factors alone) to transfer a 3 MB file from a server to a workstation?
(2) what is the effective transfer rate from the server to the workstation?
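A minimal sketch of the arithmetic, assuming 1 MB = 2^20 bytes (with 1 MB = 10^6 bytes the packet count changes accordingly) and that the sender transmits exactly one packet every x time units:

```python
import math

# Assumed interpretation: 1 MB = 2**20 bytes; one packet sent per x time units.
PACKET_SIZE = 1024                          # total bytes per packet
HEADER = 128                                # header bytes (overhead)
CHECKSUM = 4                                # checksum bytes (overhead)
PAYLOAD = PACKET_SIZE - HEADER - CHECKSUM   # 892 data bytes per packet

FILE_SIZE = 3 * 2**20                       # 3 MB file = 3,145,728 bytes

packets = math.ceil(FILE_SIZE / PAYLOAD)    # 3527 packets (last one partly full)
print(f"packets needed:     {packets}")
print(f"(1) max time:       {packets} * x time units")

# Effective rate = useful file bytes delivered per unit of elapsed time.
rate = FILE_SIZE / packets                  # bytes delivered per x time units
print(f"(2) effective rate: about {rate:.1f} / x bytes per time unit")
```

Under these assumptions the answers come out to 3527x time units for (1) and roughly 892/x bytes per time unit for (2), slightly less than the 892-byte payload per packet because the final packet is not full.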