MARKOV CHAIN



<probability> (Named after Andrei Markov.) A model of sequences of events in which the probability of the next event depends only on the current state, not on the history of how that state was reached: P(X(n+1) | X(n), ..., X(0)) = P(X(n+1) | X(n)).

A Markov process is any process with this "Markov property"; a Markov chain is a Markov process with a discrete (countable) set of states.
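
As a concrete illustration (not part of the original entry), here is a minimal Python sketch of a two-state chain; the state names and transition probabilities are invented for the example:

    import random

    # Hypothetical two-state weather chain.  Each row gives the
    # probabilities of moving to each next state, conditioned only
    # on the current state (the Markov property).
    TRANSITIONS = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def step(state):
        """Draw the next state using only the current state."""
        names, weights = zip(*TRANSITIONS[state])
        return random.choices(names, weights=weights)[0]

    state = "sunny"
    path = [state]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(" -> ".join(path))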

In simulation, the Markov chain principle is applied to selecting samples from a probability density function to be fed to the model. Simscript II.5 uses this approach for some of its modelling functions.
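
The entry does not say how Simscript II.5 implements this, and the sketch below is not presented as its method. One widely used form of the idea is Metropolis sampling, in which a Markov chain is constructed so that its long-run distribution matches the target density; here the target density and proposal width are illustrative assumptions:

    import math
    import random

    def target_pdf(x):
        # Illustrative target: an unnormalised standard normal density.
        return math.exp(-0.5 * x * x)

    def metropolis(n_samples, step_size=1.0):
        """Random-walk Metropolis: each new sample depends only on the
        current one, so the samples form a Markov chain whose long-run
        distribution follows target_pdf."""
        x = 0.0
        samples = []
        for _ in range(n_samples):
            proposal = x + random.uniform(-step_size, step_size)
            # Accept with probability min(1, density ratio).
            if random.random() < target_pdf(proposal) / target_pdf(x):
                x = proposal
            samples.append(x)
        return samples

    samples = metropolis(10000)
    print(sum(samples) / len(samples))  # near 0 for this target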


(1995-02-23)

FOLDOC computer English dictionary.