märˈkōvēən adjective
or mar·kov ˈmärˌkȯf, -ȯv; also mar·koff -ȯf
Usage: usually capitalized
Etymology: Markov (process) + -ian
: of, relating to, or resembling a Markov process or Markov chain, especially by having probabilities defined in terms of transitions from the possible existing states to other states
Markovian models
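
For illustration (a minimal sketch, not part of the entry): the defining Markovian property is that the probability distribution over next states depends only on the current state. The states, transition probabilities, and function names below are hypothetical examples chosen for the sketch.

```python
import random

# Hypothetical two-state weather chain: each row gives the
# transition probabilities out of one state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state. The distribution depends only on the
    current state -- the property the entry calls Markovian."""
    outcomes, weights = zip(*TRANSITIONS[state].items())
    return random.choices(outcomes, weights=weights)[0]

state = "sunny"
chain = [state]
for _ in range(10):
    state = step(state)
    chain.append(state)
print(chain)  # e.g. ['sunny', 'sunny', 'rainy', ...]
```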