
Vocabulary of natural language processing (POC)


Concept information

Preferred term

multi-head attention  

Definition

  • An attention mechanism used in NLP tasks such as machine translation and summarization in which the queries, keys, and values are projected into multiple heads; each head attends to the sequence independently, and the head outputs are concatenated and projected to produce a single output, allowing the mechanism to focus on different aspects of and relationships within the input sequence. (Adapted from Multi Head Attention, on medium.com, 2023)
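A minimal sketch of the mechanism described above, using NumPy. All names and dimensions here are illustrative assumptions, not part of the vocabulary entry: the input is projected, split into heads, each head runs scaled dot-product attention, and the concatenated head outputs are projected back to the model dimension.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    # x: (seq_len, d_model); the four weight matrices are (d_model, d_model).
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project, then reshape each projection to (num_heads, seq_len, d_head).
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention, computed independently per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    per_head = softmax(scores) @ v          # (num_heads, seq_len, d_head)
    # Concatenate the heads and apply the output projection.
    concat = per_head.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

# Toy usage with random weights (hypothetical sizes).
rng = np.random.default_rng(0)
seq_len, d_model, heads = 4, 8, 2
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) for _ in range(4))
y = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads=heads)
print(y.shape)  # (4, 8): same shape as the input
```

Each head works in a lower-dimensional subspace (d_head = d_model / num_heads), which is what lets different heads specialize in different relationships at no extra cost over single-head attention of the same total width.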

Broader concept

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J

Download this concept:

RDF/XML TURTLE JSON-LD

Last modified: 21/5/24