markov

Relating to stochastic processes in which the probability of each event depends only on the state attained in the immediately preceding event, not on earlier history.


Example

A Markov chain models a sequence of events where the probability of transitioning to any state depends only on the current state.
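The property can be sketched with a minimal simulation. The states and transition probabilities below are illustrative, not drawn from any source:

```python
import random

# Illustrative two-state weather chain: the next state depends
# only on the current state, never on earlier history.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using only the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for state, p in transitions[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # fallback for floating-point rounding at the boundary

def walk(start, steps, seed=0):
    """Generate a path of `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state = next_state(state, rng)
        path.append(state)
    return path
```

Note that `next_state` receives nothing but the current state: that restriction is exactly what makes the chain "memoryless."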


Synonyms

memoryless process, stochastic chain


