Cyclic Chains:
A Markov Chain is a stochastic process in which the next state of the system depends only on the current state, not on the sequence of states that preceded it. Sometimes the transition probability matrix changes from one stage to the next (from State I to State II, from State II to State III, and so on); when these matrices repeat in the same fixed order, the process forms a cyclic chain.
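As a minimal sketch of this idea, the Python snippet below (using two hypothetical two-state matrices, P1 and P2, chosen only for illustration) simulates a cyclic chain by applying the transition matrices in a repeating order:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical matrices for the repeating cycle P1 -> P2 -> P1 -> ...
    P1 = np.array([[0.9, 0.1],
                   [0.4, 0.6]])
    P2 = np.array([[0.2, 0.8],
                   [0.7, 0.3]])
    cycle = [P1, P2]  # the matrices repeat in this fixed order

    state = 0
    for step in range(10):
        P = cycle[step % len(cycle)]       # pick the matrix for this stage
        state = rng.choice(2, p=P[state])  # move according to the current row
        print(f"step {step + 1}: state {state}")

Each pass through the loop selects the matrix for the current stage, so the chain's one-step behaviour varies over time but repeats with the length of the cycle.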
Recurrent State:
A state is recurrent if, given that it has occurred at least once, it is certain to occur again; otherwise it is said to be transient. In other words, a transient state is visited only a finite number of times before the process leaves it forever, whereas a recurrent state is returned to indefinitely. If a chain is finite (i.e. has a finite number of states) and irreducible, then all of its states are recurrent. This is the most common form of Markov Chain in applications.
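To make the distinction concrete, the sketch below (using an assumed three-state example matrix) classifies the states of a finite chain. It relies on the standard finite-chain criterion that a state is recurrent exactly when every state reachable from it can lead back to it:

    import numpy as np

    def reachable(P, i):
        # Set of states reachable from i (including i itself)
        # via positive-probability paths.
        seen, stack = {i}, [i]
        while stack:
            s = stack.pop()
            for t in np.nonzero(P[s] > 0)[0]:
                if t not in seen:
                    seen.add(t)
                    stack.append(t)
        return seen

    def classify(P):
        n = len(P)
        reach = [reachable(P, i) for i in range(n)]
        # i is recurrent iff i can be reached back from
        # every state reachable from i.
        return ["recurrent" if all(i in reach[j] for j in reach[i])
                else "transient" for i in range(n)]

    # State 0 can leak into the closed class {1, 2} and never return,
    # so it is transient; states 1 and 2 are recurrent.
    P = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.3, 0.7],
                  [0.0, 0.6, 0.4]])
    print(classify(P))  # ['transient', 'recurrent', 'recurrent']

In the example matrix, once the process leaves state 0 it can never come back, so state 0 is visited at most a finite number of times, exactly the behaviour described above for a transient state.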