Vocabulary of natural language processing (POC)

Concept information

Preferred term

attention head  

Definition

  • An individual component of a multi-head attention mechanism that focuses on specific parts of the input text, helping a language model determine which words are most important for a particular task, such as understanding meaning or answering a question (a minimal sketch follows below).
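
The sketch below is an illustrative single attention head using scaled dot-product attention; it is not part of the vocabulary entry, and the names, projection matrices, and dimensions are assumptions chosen for the example.

```python
# Minimal sketch of one attention head (scaled dot-product attention).
# All names and dimensions here are illustrative assumptions.
import numpy as np

def attention_head(x, W_q, W_k, W_v):
    """Compute one attention head over a sequence of token embeddings.

    x            : (seq_len, d_model) input token embeddings
    W_q, W_k, W_v: (d_model, d_head) learned projection matrices
    """
    Q = x @ W_q                                   # queries: what each token looks for
    K = x @ W_k                                   # keys: what each token offers
    V = x @ W_v                                   # values: the information to mix
    d_head = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_head)            # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
    return weights @ V                            # each token's weighted mix of values

# Example: 4 tokens, model dimension 8, head dimension 4 (hypothetical sizes)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = attention_head(x, W_q, W_k, W_v)            # shape (4, 4): one head's output
```

In a multi-head attention mechanism, several such heads run in parallel with their own projection matrices, and their outputs are concatenated and projected back to the model dimension, letting each head focus on a different aspect of the input.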

Broader concept

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-X3T8021Q-6

Last modified: 5/13/24