MARKOV CHAIN


noun

or mar·koff chain \ˈmärˌkȯf-\

Usage: usually capitalized M

Etymology: after Andrei Andreevich Markov (died 1922), Russian mathematician

: a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
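The defining property, that the probabilities of future states are drawn from the present state alone, can be illustrated with a short simulation. The following Python sketch is not part of the dictionary entry; the two states and their transition probabilities are invented for illustration.

import random

# Illustrative two-state Markov chain (hypothetical states and
# probabilities, not from the dictionary entry). Each row gives
# the probabilities of the next state given the current one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Draw the next state using only the current state's row."""
    row = TRANSITIONS[state]
    return random.choices(list(row), weights=list(row.values()))[0]

def simulate(start: str, n: int) -> list[str]:
    """Run the chain for n steps."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))  # depends only on the present state
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))

Discarding all but the last element of path would not change the distribution of future states, which is the memorylessness the definition describes.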

Webster's New International English Dictionary.