Vocabulary of natural language processing (POC)

Concept information

Preferred term

attention weight  

Definition

  • The degree of relevance or importance that an attention mechanism assigns to each word or token in a sequence. Attention weights help a model focus on pertinent information while performing tasks such as machine translation, text summarization, and question answering, allowing it to process inputs efficiently and generate meaningful outputs.
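
As a minimal illustration (not part of the Loterre entry), the sketch below shows one common way attention weights are computed: a softmax over scaled dot-product scores between a query vector and the key vectors of each token. The function and variable names are assumptions made for this example.

```python
import numpy as np

def attention_weights(query, keys):
    """Return one attention weight per key token; the weights sum to 1."""
    d_k = keys.shape[-1]
    scores = keys @ query / np.sqrt(d_k)   # relevance score for each token
    scores -= scores.max()                 # shift for numerical stability
    exp_scores = np.exp(scores)
    return exp_scores / exp_scores.sum()   # softmax over the sequence

# Example: a sequence of 4 tokens, each with an 8-dimensional key vector
rng = np.random.default_rng(0)
q = rng.normal(size=8)
K = rng.normal(size=(4, 8))
w = attention_weights(q, K)
print(w, w.sum())  # four non-negative weights that sum to 1.0
```

Collected over all query positions, these per-token weights form the attention matrix listed under the entry terms below.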

Broader concept

  • measure

Entry terms

  • attention matrix

URI

http://data.loterre.fr/ark:/67375/8LP-HK0MKFRQ-T

Last modified: 5/29/24