We consider a discrete-time Markov chain whose state space is the set {0,1,2} and whose one-step transition probability matrix is
We say that a renewal takes place each time the chain revisits the initial state 0.
(a) What is the average number of transitions needed for a renewal to take place?
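For part (a), the average number of transitions between renewals is the mean recurrence time of state 0, which for an irreducible chain equals 1/π₀, the reciprocal of the stationary probability of state 0. Since the transition matrix did not survive in the text above, the sketch below uses a hypothetical symmetric matrix P purely to illustrate the computation:

```python
import numpy as np

# Hypothetical stand-in: the actual transition matrix is missing
# from the problem statement above.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# Stationary distribution pi solves pi P = pi: take the left
# eigenvector of P associated with eigenvalue 1 and normalize it.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

# Mean recurrence time of state 0 = expected transitions per renewal.
mean_return_time = 1.0 / pi[0]
print(mean_return_time)  # equals 3 transitions for this symmetric P
```

For this example P the stationary distribution is uniform, so π₀ = 1/3 and a renewal takes 3 transitions on average; with the problem's actual matrix the same two steps (solve πP = π, then invert π₀) give the answer.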
(b) Let N(t), for t > 0, be the number of renewals in the interval [0, t], where t is in seconds. If we suppose that every transition of the Markov chain takes one second, calculate
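The statement of part (b) is cut off after "calculate"; whatever functional of N(t) is asked for, the standard tool is the elementary renewal theorem, which under the one-second-per-transition assumption ties the long-run renewal rate to the mean recurrence time μ₀ found in part (a):

\[
\lim_{t \to \infty} \frac{N(t)}{t} = \frac{1}{\mu_0},
\qquad \text{where } \mu_0 = E[\text{return time to } 0] = \frac{1}{\pi_0}.
\]

Here π₀ is the stationary probability of state 0, so the chain renews at an average rate of π₀ renewals per second.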