A Markov chain (with states {0, 1, 2, ..., J - 1}, where J is either finite or infinite) has transition probabilities {P_{ij}; i, j ≥ 0}. Assume that P_{0j} > 0 for all j > 0 and P_{j0} > 0 for all j > 0. Also assume that for all i, j, k, we have P_{ij} P_{jk} P_{ki} = P_{ik} P_{kj} P_{ji}.
(a) Assuming also that all states are positive recurrent, show that the chain is reversible and find the steady-state probabilities {π_i} in simplest form.
(b) Find a condition on {P_{0j}; j ≥ 0} and {P_{j0}; j ≥ 0} that is sufficient to ensure that all states are positive recurrent.
Textbook: Stochastic Processes: Theory for Applications by Robert G. Gallager.
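Below is a minimal numerical sketch (not part of the exercise or the textbook) that illustrates the setup of parts (a) and (b) on a small finite example. It assumes a particular construction: a chain built from a symmetric weight matrix W with P_{ij} = W_{ij} / Σ_k W_{ik}, which is one convenient family that satisfies the stated hypotheses, including the cyclic condition. It then checks, for this example, that the candidate probabilities π_i ∝ P_{0i}/P_{i0} suggested by detailed balance against state 0 satisfy detailed balance and stationarity, and it prints the sum Σ_j P_{0j}/P_{j0}, which is the quantity whose finiteness matters for normalization when J is infinite. The names W, P, pi and the random construction are illustrative choices only.

```python
# Numerical sketch for the exercise's hypotheses; assumes a finite J and a
# chain built from a symmetric weight matrix (one way to satisfy the
# conditions, not the textbook's construction).
import numpy as np

rng = np.random.default_rng(0)
J = 6

# Symmetric weights with strictly positive 0th row/column,
# so that P_{0j} > 0 and P_{j0} > 0 for all j.
W = rng.random((J, J))
W = (W + W.T) / 2
W[0, :] += 0.1
W[:, 0] += 0.1
P = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

# Check the cyclic condition P_{ij} P_{jk} P_{ki} = P_{ik} P_{kj} P_{ji}
# for every triple (i, j, k).
for i in range(J):
    for j in range(J):
        for k in range(J):
            assert np.isclose(P[i, j] * P[j, k] * P[k, i],
                              P[i, k] * P[k, j] * P[j, i])

# Part (a): candidate steady-state probabilities from detailed balance
# against state 0:  pi_i proportional to P_{0i} / P_{i0}, pi_0 to 1.
pi = P[0, :] / P[:, 0]
pi[0] = 1.0
pi /= pi.sum()

# Detailed balance pi_i P_{ij} = pi_j P_{ji} (reversibility) ...
assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)
# ... and stationarity pi P = pi.
assert np.allclose(pi @ P, pi)

# Part (b): when J is infinite, normalizing the candidate pi requires
# sum_j P_{0j} / P_{j0} < infinity; for finite J it is automatic.
print("sum_j P_{0j}/P_{j0} =", (P[0, :] / P[:, 0]).sum())
```

The symmetric-weight construction is used only because such chains are reversible, and reversible chains are known to satisfy the cyclic (Kolmogorov) condition; the exercise itself asks for the converse direction under the stated positivity assumptions.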