Concept information
Preferred term
multi-head attention
Definition
- An attention mechanism used in NLP tasks such as machine translation and summarization that splits the model's representation into several attention heads, computes attention independently in each head, and combines the head outputs into a single output, allowing the model to focus on different aspects of and relationships within the input sequence. (Multi Head Attention, on medium.com, 2023)
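To make the mechanism concrete, the following is a minimal NumPy sketch of multi-head self-attention. The function name, shapes, and randomly initialized projection matrices are illustrative assumptions, not part of the cited definition; in a trained model the projections are learned parameters.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, seed=0):
    # Self-attention over x of shape (seq_len, d_model).
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // num_heads
    rng = np.random.default_rng(seed)
    # Placeholder random projections; a trained model learns these weights.
    w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * d_model ** -0.5
                          for _ in range(4))
    # Split the model dimension into independent heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def split_heads(m):
        return m.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = split_heads(x @ w_q), split_heads(x @ w_k), split_heads(x @ w_v)
    # Scaled dot-product attention within each head: (heads, seq, seq)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)
    heads = weights @ v  # (heads, seq, d_head)
    # Concatenate head outputs and project back to d_model.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

# Usage: 5 tokens, model width 16, 4 heads; output keeps the input shape.
out = multi_head_attention(np.random.default_rng(1).standard_normal((5, 16)), num_heads=4)
print(out.shape)  # (5, 16)

Note that each head attends over the full sequence; what is divided among the heads is the model's representation width, which lets different heads specialize in different relationships between tokens.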
Broader concept
In other languages
-
French
URI
http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J