MARKOVIAN



\märˈkōvēən\ adjective

or mar·kov \ˈmärˌkȯf, -ȯv\; also mar·koff \-ȯf\

Usage: usually capitalized

Etymology: Markov (process) + -ian

: of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states

Markovian models
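
The sense "probabilities defined in terms of transition from the possible existing states to other states" can be illustrated with a small transition table; the following Python sketch is only a hypothetical example (the state names and probabilities are invented, not part of the dictionary entry):

    # Minimal sketch of a Markov chain: each row of the transition table
    # gives the probability of moving from one existing state to the
    # possible next states. Values are illustrative only.
    import random

    transition = {  # P(next state | current state)
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current):
        # The next state depends only on the current state (the Markov property).
        options = list(transition[current].keys())
        weights = list(transition[current].values())
        return random.choices(options, weights=weights, k=1)[0]

    state = "sunny"
    for _ in range(5):
        state = next_state(state)
        print(state)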

Webster's New International English Dictionary.