Vocabulary of natural language processing (POC)

Concept information

Preferred term

multi-head attention

Definition

  • An attention mechanism used in NLP tasks such as machine translation and summarization that splits the model's representation into multiple heads, applies attention in each head in parallel, and concatenates the head outputs into a single output, allowing the mechanism to focus on different aspects and relationships in the input sequence. (Adapted from Multi Head Attention, on medium.com, 2023)
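The definition above can be sketched in code. The following is a minimal NumPy illustration, not part of the vocabulary entry: all names, shapes, and the projection matrices are illustrative assumptions. It shows the split of the model dimension into heads, per-head scaled dot-product attention, and the final concatenation and output projection.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Split the model dimension into `num_heads` subspaces, run scaled
    dot-product attention in each head, then concatenate and project.
    Illustrative sketch; assumes d_model is divisible by num_heads."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project the input, then reshape to (num_heads, seq_len, d_head)
    def split(w):
        return (x @ w).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(w_q), split(w_k), split(w_v)

    # Scaled dot-product attention, computed independently per head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    weights = softmax(scores, axis=-1)                     # rows sum to 1
    heads = weights @ v                                    # (heads, seq, d_head)

    # Concatenate the heads back into d_model, apply output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

# Toy usage with random weights (all values are placeholders)
rng = np.random.default_rng(0)
seq_len, d_model, num_heads = 4, 8, 2
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.normal(size=(d_model, d_model)) for _ in range(4))
out = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads)
print(out.shape)  # (4, 8)
```

Each head attends over the full sequence but in its own lower-dimensional subspace, which is what lets different heads specialize in different relationships.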

Broader concept

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J

Download this concept:

RDF/XML TURTLE JSON-LD. Last modified on 21/05/2024