[ ˈmɑːkɒf ]
(also Markov chain)
■ noun Statistics a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Origin
named after the Russian mathematician Andrei A. Markov (1856–1922).
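The defining property above — that each event's probability depends only on the state attained in the previous event — can be illustrated with a minimal simulation sketch. The two-state weather model and its transition probabilities are illustrative assumptions, not part of the entry:

```python
import random

# Hypothetical two-state model; states and probabilities are
# illustrative assumptions.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(state, rng):
    """Sample the next state; it depends only on the current state."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Generate a sequence of states (a realisation of the chain)."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

Because `next_state` consults only the current state, the sequence produced by `simulate` has the memoryless ("Markov") property described in the definition.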