Consider a Markov process for which the embedded Markov chain is a birth-death chain with transition probabilities P_{i,i+1} = 2/5 for all i ≥ 1, P_{i,i-1} = 3/5 for all i ≥ 1, P_{01} = 1, and P_{ij} = 0 otherwise.
(a) Find the steady-state probabilities {π_i; i ≥ 0} for the embedded chain.
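As a quick numerical sanity check for part (a) (an illustration, not part of the exercise), one can truncate the chain at a large state N and verify that the detailed-balance solution π_{i+1} = π_i P_{i,i+1} / P_{i+1,i} is stationary. The truncation level N and the NumPy-based sketch below are assumptions of this illustration.

```python
import numpy as np

# Truncation level for the numerical sketch (an assumption; the real chain is infinite).
N = 60

# Embedded birth-death chain: P[0,1] = 1; P[i,i+1] = 2/5, P[i,i-1] = 3/5 for i >= 1.
P = np.zeros((N + 1, N + 1))
P[0, 1] = 1.0
for i in range(1, N):
    P[i, i + 1] = 2 / 5
    P[i, i - 1] = 3 / 5
P[N, N - 1] = 3 / 5
P[N, N] = 2 / 5          # reflect the truncated upward step back onto state N

# Detailed balance for a birth-death chain: pi_i * P[i,i+1] = pi_{i+1} * P[i+1,i].
pi = np.zeros(N + 1)
pi[0] = 1.0
pi[1] = pi[0] * P[0, 1] / P[1, 0]                   # = 5/3
for i in range(1, N):
    pi[i + 1] = pi[i] * P[i, i + 1] / P[i + 1, i]   # = (2/3) * pi[i]
pi /= pi.sum()

print("pi_0 ≈", pi[0])                              # ≈ 1/6 for the untruncated chain
print("max |pi P - pi| =", np.max(np.abs(pi @ P - pi)))
```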
(b) Assume that the transition rate out of state i, for i ≥ 0, is given by ν_i = 2^{-i}. Find the transition rates {q_{ij}} between states and show that there is no probability-vector solution {p_i; i ≥ 0} to (7.23).
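For part (b), the rates are q_{ij} = ν_i P_{ij}. Assuming (7.23) refers to the steady-state equations for the process probabilities, any solution would have to be proportional to π_i / ν_i = π_i 2^i. The short sketch below (a hypothetical numerical illustration using NumPy) shows that the partial sums of those weights diverge, so no normalizing constant, and hence no probability vector, exists.

```python
import numpy as np

# Unnormalized embedded-chain weights from detailed balance:
# pi_0 = 1, pi_1 = 5/3, pi_{i+1} = (2/3) pi_i for i >= 1.
def pi_unnorm(i):
    return 1.0 if i == 0 else (5 / 3) * (2 / 3) ** (i - 1)

nu = lambda i: 2.0 ** (-i)   # holding rates nu_i = 2^{-i}

# Process rates: q_{0,1} = nu_0 * 1, q_{i,i+1} = nu_i * 2/5, q_{i,i-1} = nu_i * 3/5 for i >= 1.
# A steady-state vector for the process would have to satisfy p_i proportional to
# pi_i / nu_i = pi_i * 2^i, whose terms grow like (4/3)^i, so the sum diverges.
partial = np.cumsum([pi_unnorm(i) / nu(i) for i in range(60)])
print(partial[::10])         # partial sums keep growing instead of converging
```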
(c) Argue that the expected time between visits to any given state i is infinite. Find the expected number of transitions between visits to any given state i. Argue that, starting from any state i, an eventual return to state i occurs with probability 1.
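For part (c), the expected number of transitions between visits to state i in the embedded chain is 1/π_i, while the elapsed time accumulates expected holding times 1/ν_j = 2^j over the states visited. The Monte Carlo sketch below (excursions from state 0; the simulation itself is an assumption of this illustration, not part of the exercise) shows the transition count averaging near 1/π_0 = 6, while the sample-mean cycle time keeps growing as more excursions are added, consistent with an infinite expected return time.

```python
import random

def excursion_from_zero(rng):
    """Simulate the embedded chain from state 0 until it returns to 0.
    Returns (number of transitions, accumulated expected holding time)."""
    steps, time = 0, 0.0
    state = 0
    while True:
        time += 2.0 ** state          # expected holding time 1/nu_state = 2^state
        state = state + 1 if state == 0 or rng.random() < 2 / 5 else state - 1
        steps += 1
        if state == 0:
            return steps, time

rng = random.Random(0)
for n in (10**3, 10**4, 10**5):
    data = [excursion_from_zero(rng) for _ in range(n)]
    mean_steps = sum(s for s, _ in data) / n
    mean_time = sum(t for _, t in data) / n
    print(n, "excursions: mean transitions ≈", round(mean_steps, 2),
          "(theory 1/pi_0 = 6), sample-mean cycle time ≈", round(mean_time, 1))
```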
(d) Consider the sampled-time approximation of this process with δ = 1. Draw the graph of the resulting Markov chain and argue why it must be null recurrent.
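For part (d), with δ = 1 the sampled-time chain has self-loop probability P_{ii} = 1 - 2^{-i} and step probabilities 2^{-i}·(2/5) up and 2^{-i}·(3/5) down for i ≥ 1, while state 0 moves to 1 with probability 1. The sketch below (a hypothetical truncation at state N, solved with NumPy) shows the stationary mass of the truncated chain escaping toward the truncation boundary as N grows, which is consistent with the sampled-time chain being recurrent but not positive recurrent.

```python
import numpy as np

def sampled_time_stationary(N):
    """Stationary vector of the sampled-time chain (delta = 1) truncated at state N."""
    nu = 2.0 ** -np.arange(N + 1)               # nu_i = 2^{-i}
    P = np.zeros((N + 1, N + 1))
    P[0, 1] = 1.0                               # delta * q_{01} = nu_0 * 1
    for i in range(1, N + 1):
        up = nu[i] * 2 / 5 if i < N else 0.0    # truncate: no upward step from state N
        down = nu[i] * 3 / 5
        P[i, i - 1] = down
        if i < N:
            P[i, i + 1] = up
        P[i, i] = 1.0 - up - down               # self-loop 1 - delta * nu_i
    # Detailed balance gives p_{i+1}/p_i = P[i,i+1]/P[i+1,i] = 4/3 for i >= 1,
    # so the weights grow geometrically; solve p P = p numerically to confirm.
    w, v = np.linalg.eig(P.T)
    p = np.real(v[:, np.argmin(np.abs(w - 1))])
    p = np.abs(p) / np.abs(p).sum()
    return p

for N in (10, 20, 40):
    p = sampled_time_stationary(N)
    print(f"N={N}: p_0 ≈ {p[0]:.2e}, mass in top 5 states ≈ {p[-5:].sum():.3f}")
```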
Textbook: Stochastic Processes: Theory for Applications, by Robert G. Gallager.