(A Gambling Model)
Consider a gambler who, at each play of the game, either wins $1 with probability p or loses $1 with probability 1 - p. If we suppose that our gambler quits playing either when he goes broke or when he attains a fortune of $N, then the gambler's fortune is a Markov chain having transition probabilities
$$P_{i,i+1} = p = 1 - P_{i,i-1}, \qquad i = 1, 2, \ldots, N - 1,$$
$$P_{00} = P_{NN} = 1.$$
States 0 and N are called absorbing states since, once entered, they are never left. Note that the preceding is a finite-state random walk with absorbing barriers (states 0 and N).
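The chain described above is easy to simulate directly: from each interior state the fortune moves up by 1 with probability p and down by 1 otherwise, and the walk stops upon hitting either absorbing barrier. The following sketch (function names are illustrative, not from the text) runs the chain to absorption and estimates, by Monte Carlo, the probability of reaching N before going broke.

```python
import random

def gamblers_ruin(p, start, N, rng=None):
    """Run one realization of the gambler's fortune chain.

    From state i (0 < i < N) the chain moves to i+1 with probability p
    and to i-1 with probability 1 - p; states 0 and N are absorbing.
    Returns the absorbing state reached (0 = broke, N = target fortune).
    """
    rng = rng or random.Random()
    fortune = start
    while 0 < fortune < N:
        fortune += 1 if rng.random() < p else -1
    return fortune

def win_probability(p, start, N, trials=100_000, seed=0):
    """Monte Carlo estimate of P(reach N before 0 | start)."""
    rng = random.Random(seed)
    wins = sum(gamblers_ruin(p, start, N, rng) == N for _ in range(trials))
    return wins / trials
```

For the symmetric case p = 1/2 the chain is a simple symmetric random walk, and the probability of reaching N from state i is known to be i/N; for example, `win_probability(0.5, 5, 10)` should come out close to 0.5.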