Let $\{X_n\}_{n \in \mathbb{N}_0}$ be a Markov chain with the following transition matrix.
Suppose that the chain starts from state 1.

(1) What is the expected time that will pass before the chain first hits state 3?
(2) What is the expected number of visits to state 2 before state 3 is hit?
(3) Would your answers to (1) and (2) change if we replaced the values in the first row of P with any other values (as long as P remains a stochastic matrix)? Would states 1 and 2 still be transient?
(4) Use the idea of part (3) to answer the following question. What is the expected number of visits to state 2 before a Markov chain with the transition matrix below hits state 3 for the first time (the initial state is still 1)?
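The quantities in (1) and (2) can be checked numerically with the fundamental matrix of an absorbing chain: make state 3 absorbing, let Q be the transition matrix restricted to the transient states, and form N = (I − Q)⁻¹, whose (i, j) entry is the expected number of visits to j (counting the start) before absorption, starting from i. Since the problem's transition matrix is not reproduced above, the sketch below uses an assumed 3-state matrix purely for illustration.

```python
import numpy as np

# Assumed 3-state chain on states {1, 2, 3}; these entries are NOT the
# problem's matrix, just a stochastic example for demonstration.
P = np.array([
    [0.2, 0.5, 0.3],   # row for state 1 (assumed values)
    [0.4, 0.4, 0.2],   # row for state 2 (assumed values)
    [0.0, 0.0, 1.0],   # state 3 made absorbing: we stop once it is hit
])

# Restrict to the transient states {1, 2} and form the fundamental matrix
# N = (I - Q)^{-1}.  N[i, j] = expected visits to state j+1 before
# absorption, starting from state i+1 (the start state counts as a visit).
Q = P[:2, :2]
N = np.linalg.inv(np.eye(2) - Q)

# Answer to (1): expected time to first hit state 3 from state 1 is the
# total expected number of steps spent in transient states.
expected_hitting_time = N[0].sum()

# Answer to (2): expected visits to state 2 before state 3 is hit.
expected_visits_to_2 = N[0, 1]

print(expected_hitting_time, expected_visits_to_2)
```

For part (3), note that the first-row entries only affect where the chain jumps out of state 1; rerunning the sketch with a different (stochastic) first row shows how the two quantities respond, which is a quick sanity check on the reasoning, not a proof.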