Concept information
Preferred term
cross attention
Definition
- An attention mechanism used in the decoder of a transformer that lets the model draw on information from different parts of the input sequence while generating the output sequence. (Based on Bharti, Unraveling Transformers: A Deep Dive into Self-Attention and Cross-Attention Mechanisms, on medium.com, 2024)
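The mechanism described in the definition can be sketched with plain NumPy: queries are built from the decoder's states while keys and values come from the encoded input sequence, so each generated position mixes in information from the input. This is a minimal illustrative sketch, not the source's implementation; the weight matrices are random stand-ins for learned parameters, and all dimensions are arbitrary.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_states, d_k):
    # Hypothetical projection matrices; in a real transformer these are learned.
    rng = np.random.default_rng(0)
    d_model = decoder_states.shape[-1]
    W_q = rng.standard_normal((d_model, d_k))
    W_k = rng.standard_normal((d_model, d_k))
    W_v = rng.standard_normal((d_model, d_k))

    Q = decoder_states @ W_q          # queries from the decoder (output side)
    K = encoder_states @ W_k          # keys from the encoder (input sequence)
    V = encoder_states @ W_v          # values from the encoder
    scores = Q @ K.T / np.sqrt(d_k)   # scaled dot-product attention
    weights = softmax(scores, axis=-1)
    return weights @ V                # each output row mixes encoder information

dec = np.ones((2, 8))   # 2 target positions being generated
enc = np.ones((5, 8))   # 5 source positions from the input sequence
out = cross_attention(dec, enc, d_k=4)
print(out.shape)        # one attended vector per decoder position
```

The key contrast with self-attention is in the arguments: self-attention derives Q, K, and V all from the same sequence, whereas here Q comes from one sequence and K, V from another.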
Broader concept
Synonym(s)
- cross-attention mechanism
Translations
-
French
URI
http://data.loterre.fr/ark:/67375/8LP-F3BF19DC-2