Concept information
... > Quality of Health Care > Health Care Evaluation Mechanisms > Statistics as Topic > Stochastic Processes
Preferred term
Markov Chains
Type
- mesh:Descriptor
Definition
- A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
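As an illustrative formalization (not part of the thesaurus record itself), the Markov property in this definition can be written for a discrete-time chain (X_n) as

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

i.e. conditioning on the full past history adds nothing beyond the present state X_n.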
Broader concept
- Stochastic Processes
Entry terms
- Markov Chain
- Markov Process
In other languages
- French
  - Chaines de Markoff
  - Chaînes de Markoff
  - Chaînes de Markov
  - Chaines markoviennes
  - Processus de Markoff
  - Processus de Markov
  - Processus markovien
  - Processus markoviens
URI
http://data.loterre.fr/ark:/67375/JVR-CPTQQS1R-1
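A minimal sketch of dereferencing this URI for a machine-readable version of the record, assuming (this is an assumption, not stated above) that the Loterre server honours HTTP content negotiation for RDF serialisations such as Turtle:

import requests

# Concept URI taken from the record above.
CONCEPT_URI = "http://data.loterre.fr/ark:/67375/JVR-CPTQQS1R-1"

# Request Turtle; whether this media type is actually served is assumed, not guaranteed.
response = requests.get(CONCEPT_URI, headers={"Accept": "text/turtle"}, timeout=30)
response.raise_for_status()
print(response.text)  # SKOS triples for the concept, if the negotiation succeeds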