See also: markovian

English

Etymology

Markov + -ian, named for the Russian mathematician Andrey Markov.

Pronunciation

Adjective

Markovian (not comparable)

  1. (statistics, of a process) Exhibiting the Markov property, in which the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states (see the formal statement below the quotations).
    • 1992, Casella and George, “Explaining the Gibbs Sampler”, in The American Statistician, 46(3), pages 167–174:
      It is not immediately obvious that a random variable with distribution f(x) can be produced by the Gibbs sequence of (2.3) or that the sequence even converges. That this is so relies on the Markovian nature of the iterations, which we now develop in detail for the simple case of a 2 × 2 table with multinomial sampling.
    • 2018, Guidolin and Pedio, Essentials of Time Series for Financial Applications, Academic Press:
      AR(p) models are simple univariate devices to capture the often-observed Markovian nature of financial and macroeconomic data […]

Derived terms

Translations

See also