ThesoTM thesaurus

Concept information

Preferred term

BERT  

Definition

  • "BERT is designed to pretrain deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers." (Devlin et al., 2019).

Broader concept

Entry terms

  • Bidirectional Encoder Representations from Transformers

Bibliographic citation(s)

  • Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv:1810.04805 [cs]. http://arxiv.org/abs/1810.04805

has application field

has design country

  • United States

has for input language

is implemented by

is encoded in

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-SSWGBD85-7
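Since Loterre publishes its vocabularies as linked open data, the concept identified by the URI above can in principle be dereferenced programmatically. The sketch below uses Python's requests library and assumes the server supports HTTP content negotiation for an RDF serialisation; that assumption is not stated in the record itself.

```python
import requests

# Persistent identifier of this concept (from the URI field above).
uri = "http://data.loterre.fr/ark:/67375/LTK-SSWGBD85-7"

# Ask for RDF/XML; content negotiation is an assumption, not documented here.
response = requests.get(uri, headers={"Accept": "application/rdf+xml"})
response.raise_for_status()
print(response.text[:500])  # first part of the returned record
```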
