Vocabulary of natural language processing (POC)

Concept information

Preferred term

cross attention  

Definition

  • An attention mechanism employed in the decoder of transformers that allows the model to consider information from different parts of the input sequence while generating the output sequence. (Based on Bharti, Unraveling Transformers: A Deep Dive into Self-Attention and Cross-Attention Mechanisms, on medium.com, 2024)
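The definition above can be illustrated with a minimal sketch of scaled dot-product cross-attention in NumPy: queries are projected from the decoder's states, while keys and values are projected from the encoder's representation of the input sequence, so each generated position can weigh information from every input position. The function and weight names (`cross_attention`, `Wq`, `Wk`, `Wv`) are illustrative, not from any particular library; in a real transformer these projections are learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_outputs, Wq, Wk, Wv):
    """Cross-attention: queries from the decoder, keys/values from the encoder."""
    Q = decoder_states @ Wq            # (t_dec, d_k)
    K = encoder_outputs @ Wk           # (t_enc, d_k)
    V = encoder_outputs @ Wv           # (t_enc, d_k)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # (t_dec, t_enc)
    weights = softmax(scores, axis=-1)        # each output step attends over all input steps
    return weights @ V                        # (t_dec, d_k)

d_model, d_k = 8, 4
enc = rng.normal(size=(5, d_model))    # 5 input (encoder) positions
dec = rng.normal(size=(3, d_model))    # 3 output (decoder) positions
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = cross_attention(dec, enc, Wq, Wk, Wv)
print(out.shape)
```

Note the asymmetry that distinguishes cross-attention from self-attention: the query sequence (decoder) and the key/value sequence (encoder) can have different lengths, and the output has one row per decoder position.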

Broader concept

Alternative labels

  • cross-attention mechanism

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-F3BF19DC-2

Download this concept:

RDF/XML TURTLE JSON-LD — last modified 13/5/24