English

Noun

Markov chain (plural Markov chains)

  1. (probability theory) A discrete-time stochastic process with the Markov property: the conditional distribution of the next state depends only on the current state, not on the earlier history of the process.

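A minimal sketch of the idea in Python, not part of the entry: the state names ("sunny", "rainy") and transition probabilities are illustrative assumptions. The next state is sampled from a distribution determined solely by the current state, which is the Markov property in the definition above.

```python
import random

# Illustrative two-state transition probabilities (assumed, not from the entry).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # fallback for floating-point rounding

state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)  # e.g. ['sunny', 'sunny', 'rainy', ...]
```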
Translations

See also