Astronomy (thesaurus)

Concept information

Preferred term

Markov chain  

Definition

  • A Markov chain is a sequence of random variables in which the distribution of the future variable depends on the present variable but is independent of the way in which the present state arose from its predecessors. In other words, a Markov chain describes a chance process in which the future state can be predicted from its present state as accurately as if its entire earlier history were known. Markov chains are named after the Russian mathematician Andrei Andreyevich Markov (1856–1922), who first studied them in a literary context, applying the idea to an analysis of vowels and consonants in a text by Pushkin. His work launched the theory of stochastic processes, which has since been applied in quantum theory, particle physics, and genetics. (Encyclopedia of Science, by David Darling, https://www.daviddarling.info/encyclopedia/M/Markov_chain.html)
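The memorylessness described above can be sketched in a few lines of Python. The two-state "weather" model below is a hypothetical illustration (the states and transition probabilities are invented for this example, not taken from the source): each step draws the next state using only the current state, never the earlier history.

```python
import random

# Hypothetical two-state transition table: probabilities of moving
# from the current state (outer key) to the next state (inner key).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Draw the next state from the current one alone (the Markov property)."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, steps, seed=0):
    """Generate a chain of `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

Because each call to `next_state` looks only at `chain[-1]`, discarding the rest of the history changes nothing about the process, which is exactly the property the definition describes.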

Entry terms

  • Markovian chain

URI

http://data.loterre.fr/ark:/67375/MDL-FHGMV6FX-D
