Concept information
Preferred term
attention weight
Definition
- The degree of relevance or importance assigned to each word or token in a sequence when a model processes a task. Attention weights help models focus on pertinent information in tasks such as machine translation, text summarization, and question answering, allowing them to process inputs efficiently and generate meaningful outputs.
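In Transformer-style models, these weights are commonly computed as a softmax over scaled dot products between a query vector and the key vectors of the tokens. The sketch below is illustrative only and is not taken from this vocabulary; the function name, shapes, and values are assumptions.

```python
# Minimal sketch (illustrative, not from the source): scaled dot-product
# attention weights as used in Transformer-style models.
import numpy as np

def attention_weights(query, keys):
    """Return one weight per key token, non-negative and summing to 1."""
    d_k = keys.shape[-1]
    scores = keys @ query / np.sqrt(d_k)   # relevance score of each token
    scores -= scores.max()                 # shift for numerical stability
    exp_scores = np.exp(scores)
    return exp_scores / exp_scores.sum()   # softmax over the sequence

# Example: 4 tokens, each with an 8-dimensional key vector (random data)
rng = np.random.default_rng(0)
q = rng.normal(size=8)
K = rng.normal(size=(4, 8))
print(attention_weights(q, K))  # four weights summing to 1; higher = more relevant
```

Tokens with larger weights contribute more to the model's output at that step, which is what lets the model "focus" on pertinent parts of the input.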
Broader concept
Alternative labels
- attention matrix
In other languages
- French: poids d'attention
URI
http://data.loterre.fr/ark:/67375/8LP-HK0MKFRQ-T