Vocabulary of natural language processing (POC)

Concept information

Preferred term

cross attention  

Definition

  • An attention mechanism employed in the decoder of transformers that allows the model to consider information from different parts of the input sequence while generating the output sequence. (Based on Bharti, Unraveling Transformers: A Deep Dive into Self-Attention and Cross-Attention Mechanisms, on medium.com, 2024)
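To make the definition concrete, here is a minimal sketch of single-head scaled dot-product cross-attention in plain NumPy. It is illustrative only and not part of the vocabulary entry: the names (cross_attention, W_q, W_k, W_v) are assumptions. Queries are projected from the decoder states and keys/values from the encoder output, so each generated position attends over the input sequence.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_states, W_q, W_k, W_v):
    # Queries come from the decoder (the sequence being generated);
    # keys and values come from the encoder (the input sequence).
    Q = decoder_states @ W_q                  # (T_dec, d_k)
    K = encoder_states @ W_k                  # (T_enc, d_k)
    V = encoder_states @ W_v                  # (T_enc, d_v)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # (T_dec, T_enc)
    weights = softmax(scores, axis=-1)        # each output step attends over all input positions
    return weights @ V                        # (T_dec, d_v)

# Toy usage: 3 input (encoder) tokens, 2 output (decoder) steps, model dim 4.
rng = np.random.default_rng(0)
enc = rng.normal(size=(3, 4))
dec = rng.normal(size=(2, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
print(cross_attention(dec, enc, W_q, W_k, W_v).shape)  # (2, 4)
```

This is what distinguishes cross-attention from self-attention: in self-attention, queries, keys, and values are all projected from the same sequence, whereas here the query side and the key/value side come from two different sequences.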

Broader concept

Synonym(s)

  • cross-attention mechanism

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-F3BF19DC-2

Last modified on 13/05/2024