MARKOV PROCESS

[Markov process] n (1939): a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also: Markov chain -- called also Markoff process
Merriam-Webster English Dictionary. 2012
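As an illustration of the definition (not part of the dictionary entry), here is a minimal Python sketch simulating Brownian motion, the continuous-state Markov process the entry cites: each new state depends only on the current state, never on the earlier path. The function name and parameters are illustrative choices, not from the source.

    import random

    def simulate_brownian_motion(steps=1000, dt=0.01, x0=0.0):
        """Simulate a standard Brownian motion path: a Markov process
        whose state is a continuous real value."""
        x = x0
        path = [x]
        for _ in range(steps):
            # Gaussian increment with variance dt; the Markov property:
            # the next state depends only on the current x.
            x += random.gauss(0.0, dt ** 0.5)
            path.append(x)
        return path

    if __name__ == "__main__":
        path = simulate_brownian_motion()
        print(f"final state after {len(path) - 1} steps: {path[-1]:.4f}")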