MARKOV CHAIN



/mahr"kawf/ , Statistics.

a Markov process restricted to discrete random events or to discontinuous time sequences.

Also, Markoff chain.

[1940-45; see MARKOV PROCESS]
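
In modern terms, the definition describes a sequence of random variables X0, X1, X2, ... over a discrete set of states, where the distribution of the next state depends only on the current one: P(X(n+1) = j | X(n) = i, ..., X(0)) = P(X(n+1) = j | X(n) = i). As a minimal illustrative sketch of this property (the two weather states, their transition probabilities, and the function names are assumptions chosen for the example, not part of the entry), a simulation in Python:

    # Illustrative sketch: a two-state Markov chain.
    # States and probabilities below are assumed for the example.
    import random

    # Transition probabilities: outer key is the current state,
    # inner keys are possible next states; each row sums to 1.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        # Draw the next state; it depends only on the current
        # state -- the defining (Markov) property.
        r = random.random()
        cumulative = 0.0
        for next_state, p in TRANSITIONS[state].items():
            cumulative += p
            if r < cumulative:
                return next_state
        return next_state  # guard against floating-point rounding

    def simulate(start, n):
        # Generate a chain of n states starting from `start`.
        chain = [start]
        for _ in range(n - 1):
            chain.append(step(chain[-1]))
        return chain

    print(simulate("sunny", 10))

Running the sketch prints one 10-step path, e.g. ['sunny', 'sunny', 'rainy', ...]; since transitions are random, each run differs.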

Random House Webster's Unabridged English Dictionary.