MARKOV PROCESS


Meaning of MARKOV PROCESS in English

sequence of possibly dependent random variables (x1, x2, x3, . . . ), identified by increasing values of a parameter, commonly time, with the property that any prediction of the value of xn, knowing x1, x2, . . . , xn-1, may be based on xn-1 alone. That is, the future value of the variable depends only on the present value and not on the sequence of past values. These sequences are named for A. A. Markov, who was the first to study them systematically. Sometimes the term Markov process is restricted to sequences in which the random variables can assume continuous values, and analogous sequences of discrete-valued variables are called Markov chains. See also stochastic process.
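The discrete-valued case (a Markov chain) can be sketched in a few lines of Python. The two weather states and the transition probabilities below are purely illustrative assumptions; the point is that the next value is drawn using the current state alone, never the earlier history.

```python
import random

# Illustrative two-state transition matrix (assumed values, not from the
# definition above): each state maps to the probabilities of the next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    # The Markov property: the next value depends only on the current
    # state, not on any earlier part of the sequence.
    probs = TRANSITIONS[current]
    states = list(probs)
    weights = [probs[s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    # Generate a sample path (x1, x2, ..., x_{steps+1}) of the chain.
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

For example, `simulate("sunny", 10)` produces a sequence of 11 states in which each transition was sampled from the row of the matrix for the state immediately before it.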

Britannica English vocabulary.