Medical Subject Headings (thesaurus)

Concept information

Preferred term

Markov Chains  

Type

  • mesh:Descriptor

Definition

  • A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
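As a minimal formal sketch of this "memorylessness" property (the notation is illustrative and not part of the MeSH record), for a discrete-time chain with states X_0, X_1, X_2, … the conditional distribution of the next state depends only on the current one:

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

In other words, once the present state is known, the earlier history of the process adds no further information about its future.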

Alternative labels

  • Markov Chain
  • Markov Process

In other languages

  • French

  • Chaines de Markoff
  • Chaînes de Markoff
  • Chaînes de Markov
  • Chaines markoviennes
  • Processus de Markoff
  • Processus de Markov
  • Processus markovien
  • Processus markoviens

URI

http://data.loterre.fr/ark:/67375/JVR-CPTQQS1R-1

Download this concept:

RDF/XML · TURTLE · JSON-LD

Created 1/1/99, last modified 8/7/08