Assignment:
A discrete-time process is a Markov chain if and only if
a) the state of the process at each time instant does not depend on the values assumed at any other instant;
b) the future states do not depend on the current state;
c) the future states do not depend on the past states;
d) the future states do not depend on the current state, given the past states;
e) the future states do not depend on the past states, given the current state.
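As a minimal sketch of how a discrete-time Markov chain is simulated, the example below (state names and transition probabilities are hypothetical) draws each next state using only the current state; no earlier history is consulted:

```python
import random

# Hypothetical two-state chain: P[s][t] = probability of moving
# from current state s to next state t.
P = {0: {0: 0.9, 1: 0.1},
     1: {0: 0.5, 1: 0.5}}

def step(state, rng):
    """Draw the next state from the current state alone."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, start=0, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        # Only path[-1] (the current state) is used; past states are ignored.
        path.append(step(path[-1], rng))
    return path

print(simulate(10))
```

Note that `step` receives only the current state as input, which is exactly the conditional-independence structure the question asks about.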