
Vocabulary of natural language processing (POC)


Concept information

NLP resources and evaluation > measure > attention weight

Preferred term

attention weight  

Definition

  • The degree of relevance or importance a model assigns to each word or token in a sequence while processing it. Attention weights help models focus on pertinent information in tasks such as machine translation, text summarization, and question answering, allowing them to process inputs efficiently and generate meaningful outputs.
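The entry does not specify how such weights are computed; a common formulation is the scaled dot-product attention used in transformer models. The sketch below illustrates that formulation only as an assumption about what is meant here; the function name and toy data are hypothetical, not part of this vocabulary entry.

```python
import numpy as np

def attention_weights(query, keys):
    """Illustrative scaled dot-product attention weights (an assumed
    formulation, not prescribed by this entry).

    query: (d,) embedding of the token doing the attending.
    keys:  (n, d) matrix, one row per token in the sequence.
    Returns an (n,) vector of non-negative weights summing to 1.
    """
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)  # one relevance score per token
    scores -= scores.max()              # shift for numerical stability
    weights = np.exp(scores)
    return weights / weights.sum()      # softmax: normalize to sum to 1

# Toy example: 4 tokens with 3-dimensional embeddings (random values).
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 3))
query = rng.normal(size=3)
print(attention_weights(query, keys))   # four weights that sum to 1
```

Stacking one such weight vector per query token yields the "attention matrix" listed under the alternative labels below.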

Broader concept

  • measure

Alternative labels

  • attention matrix

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-HK0MKFRQ-T

Download this concept: RDF/XML, TURTLE, JSON-LD

Last modified: 29/5/24