ThesoTM thesaurus

Concept information

Preferred term

DeBERTa

Definition

  • "Transformer-based neural language model DeBERTa (Decoding-enhanced BERT with disentangled attention), which improves previous state-of-the-art PLMs [Pre-trained Language Models] using two novel techniques: a disentangled attention mechanism, and an enhanced mask decoder." (He et al., 2020).

Entry terms

  • Decoding-enhanced BERT with disentangled attention

Bibliographic citation(s)

  • He, P., Liu, X., Gao, J., & Chen, W. (2020). DeBERTa: Decoding-enhanced BERT with Disentangled Attention. arXiv:2006.03654.

URI

http://data.loterre.fr/ark:/67375/LTK-G64XHKJT-9
