Concept information
Preferred term
cross attention
Definition
- An attention mechanism used in the decoder of a transformer that allows the model to attend to different parts of the input sequence while generating the output sequence. (Based on Bharti, Unraveling Transformers: A Deep Dive into Self-Attention and Cross-Attention Mechanisms, on medium.com, 2024)
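By way of illustration (not part of the original entry), here is a minimal sketch of a single-head cross-attention layer in PyTorch. All class, variable, and dimension names are assumptions chosen for the example: queries come from the decoder states, while keys and values come from the encoder output, so each output position attends over the input sequence.

```python
import math
import torch
import torch.nn as nn

class CrossAttention(nn.Module):
    """Single-head cross-attention sketch: queries from the decoder,
    keys and values from the encoder output."""
    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, decoder_states, encoder_output):
        # decoder_states: (batch, tgt_len, d_model)
        # encoder_output: (batch, src_len, d_model)
        q = self.q_proj(decoder_states)
        k = self.k_proj(encoder_output)
        v = self.v_proj(encoder_output)
        # Scaled dot-product scores: (batch, tgt_len, src_len)
        scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
        weights = scores.softmax(dim=-1)  # attention over input positions
        return weights @ v                # (batch, tgt_len, d_model)

# Usage: each of the 5 decoder positions attends over the 7 input positions.
attn = CrossAttention(d_model=64)
dec = torch.randn(2, 5, 64)  # hypothetical decoder states
enc = torch.randn(2, 7, 64)  # hypothetical encoder output
out = attn(dec, enc)         # shape: (2, 5, 64)
```

This differs from self-attention only in where the keys and values originate: in self-attention all three projections are taken from the same sequence, whereas here the decoder queries a separate (encoded) input sequence.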
Generic concept
Alternative labels
- cross-attention mechanism
In other languages
-
French
URI
http://data.loterre.fr/ark:/67375/8LP-F3BF19DC-2