Suppose a TCP connection, with window size 1, loses every other packet. Packets that do arrive have RTT = 1 second. What happens? What happens to TimeOut? Work this out for the two cases below (a simulation sketch follows the list):
(a) After a packet is eventually received, we pick up where we left off, resuming with EstimatedRTT initialized to its pre-timeout value and TimeOut double that.
(b) After a packet is eventually received, we resume with TimeOut initialized to the last exponentially backed-off value used for the timeout interval.
In the following four exercises, the calculations are straightforward with a spreadsheet.