Concept information
Preferred term
KL divergence
Definition
- A measure used to compare two probability distributions, for example the distribution of words generated by a language model and the distribution of words in a reference corpus (see the formula sketched below).
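A minimal sketch of the standard discrete-case formula, assuming P and Q are probability distributions over the same vocabulary (the symbols are illustrative and not part of the original entry):

\[
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)}
\]

Note that the measure is asymmetric: in general D_KL(P ‖ Q) ≠ D_KL(Q ‖ P), so which distribution plays which role matters.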
Broader concept
In other languages
- French: divergence de Kullback-Leibler
URI
http://data.loterre.fr/ark:/67375/8LP-H8LWM509-0