1. A Markov chain with three states, S = {1, 2, 3}, has the following transition matrix:
a) Draw the state transition diagram for this chain.
b) If we know P(X1 = 1) = P(X1 = 2) = 1/4, find P(X1 = 3).
c) Is this chain irreducible?
d) Find the steady-state probabilities for this chain. (A computational sketch follows below.)
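The answers to (c) and (d) depend on the actual transition matrix, which is not reproduced above. As a rough sketch only, the snippet below shows one way to check irreducibility and solve for the steady-state probabilities numerically with NumPy; the matrix P used here is a made-up placeholder, not the matrix from the problem.

```python
import numpy as np

# Placeholder row-stochastic transition matrix (NOT the one from the problem;
# substitute the actual matrix from the question here).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])
n = P.shape[0]

# Irreducibility check: the chain is irreducible iff every state can reach
# every other state, i.e. (I + P)^(n-1) has no zero entries.
reach = np.linalg.matrix_power(np.eye(n) + P, n - 1)
print("irreducible:", bool((reach > 0).all()))

# Steady-state probabilities: solve pi P = pi subject to sum(pi) = 1 by
# stacking the normalization constraint onto the system (P^T - I) pi = 0.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state probabilities:", pi)
```

For an irreducible finite chain, the stacked linear system has a unique solution, so the least-squares solve returns the exact stationary distribution.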