We consider a discrete-time Markov chain whose state space is the set {0,1,2} and whose one-step transition probability matrix is
![](https://test.transtutors.com/qimg/0b3122cf-3aa3-4922-b346-306382f8f966.png)
We say that a renewal takes place each time the initial state 0 is revisited.
(a) What is the average number of transitions needed for a renewal to take place?
(b) Let N(t), for t > 0, denote the number of renewals in the interval [0, t], where t is measured in seconds. Assuming that every transition of the Markov chain takes one second, calculate
![](https://test.transtutors.com/qimg/91d1c684-3861-44f2-ab91-6370e4a99ac8.png)
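The general method can be sketched numerically. For an irreducible, positive-recurrent chain, the mean recurrence time of state 0 is $m_0 = 1/\pi_0$, where $\pi$ is the stationary distribution, and by the elementary renewal theorem (with one-second transitions) $\lim_{t\to\infty} N(t)/t = 1/m_0 = \pi_0$ with probability 1. The matrix below is a hypothetical placeholder, since the actual transition matrix appears only in the image above; substitute the matrix from the problem to get the real answers.

```python
import numpy as np

# Hypothetical 3x3 transition matrix (NOT the one from the problem image;
# replace with the actual matrix when computing the answer).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# Stationary distribution: solve pi @ P = pi together with sum(pi) = 1.
# Build the linear system (P^T - I) pi = 0, replacing the last equation
# with the normalization constraint.
n = P.shape[0]
A = P.T - np.eye(n)
A[-1, :] = 1.0                     # normalization row: sum of pi equals 1
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

# (a) Mean recurrence time of state 0 (average transitions per renewal).
m0 = 1.0 / pi[0]

# (b) Long-run renewal rate: lim N(t)/t = pi[0] renewals per second.
rate = pi[0]

print("stationary distribution:", pi)
print("mean recurrence time of state 0:", m0)
print("long-run renewal rate:", rate)
```

For this placeholder matrix, $\pi = (21/64,\, 24/64,\, 19/64)$, so a renewal takes $64/21 \approx 3.05$ transitions on average and the renewal rate is $21/64 \approx 0.33$ per second; the same two lines (`m0` and `rate`) give the answers to (a) and (b) once the matrix from the problem is used.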