
Vocabulary of natural language processing (POC)


Concept information

NLP resources and evaluation > measure > attention weight

Preferred term

attention weight  

Definition

  • The degree of relevance or importance assigned to each word or token in a sequence when processing a task. Attention weights help models focus on pertinent information while performing tasks such as machine translation, text summarization, and question answering, allowing them to efficiently process and generate meaningful outputs.
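The definition above can be illustrated with a minimal sketch of scaled dot-product attention, one common way attention weights are computed in practice; the function name and example vectors here are illustrative, not part of the vocabulary entry:

```python
import numpy as np

def attention_weights(query, keys):
    # Scaled dot-product score between the query and each key vector
    scores = keys @ query / np.sqrt(query.shape[-1])
    # Softmax turns scores into weights: positive, summing to 1, so each
    # weight is the degree of relevance assigned to one token
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

# Example: one query attending over 3 tokens with 4-dimensional keys
q = np.array([1.0, 0.0, 1.0, 0.0])
K = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.5, 0.5, 0.5, 0.5]])
w = attention_weights(q, K)
print(w)        # one weight per token; the token matching the query best gets the largest weight
print(w.sum())  # weights sum to 1
```

Stacking the weights for every query position yields the attention matrix listed below as a synonym.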

Broader concept

  • measure

Synonym(s)

  • attention matrix

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-HK0MKFRQ-T

Last modified: 29/05/2024