[Markov chain] n [A. A. Markov d. 1922 Russ. mathematician] (1942): a usu. discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved--called also Markoff chain
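The condition described in the definition is the Markov property. A minimal formal sketch in standard notation (the symbols X_n for the state of the process at step n are not part of the entry itself), assuming a discrete-time, discrete-state chain as in the entry's "usu. discrete" qualifier:

    P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

That is, conditioning on the whole history of the process gives the same distribution for the next state as conditioning on the present state alone.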
Merriam-Webster English Dictionary, 2012.