Concept information
Quality of Health Care > Health Care Evaluation Mechanisms > Statistics as Topic > Stochastic Processes
Preferred term
Markov Chains
Type
mesh:Descriptor
Definition
- A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
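The Markov property in this definition can be illustrated with a short sketch (not part of the thesaurus entry): a two-state chain whose next state is drawn using only the current state, never the earlier history. The states and transition probabilities below are invented for illustration.

```python
import random

# Hypothetical transition probabilities: P[i][j] = Pr(next = j | current = i).
P = {
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def step(state, rng):
    """Draw the next state from the transition row of the current state only."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps; each step depends only on the last state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("A", 10))
```

Because `step` receives only the current state, any two histories ending in the same state produce the same distribution over the next state, which is exactly the conditional-independence condition in the definition above.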
Broader concept
Alternative labels
- Markov Chain
- Markov Process
In other languages
French
- Chaines de Markoff
- Chaînes de Markoff
- Chaînes de Markov
- Chaines markoviennes
- Processus de Markoff
- Processus de Markov
- Processus markovien
- Processus markoviens
URI
http://data.loterre.fr/ark:/67375/JVR-CPTQQS1R-1