Concept information
Preferred term
BERT
Definition
- BERT is designed to pretrain deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers (Devlin et al., 2019).
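The "jointly conditioning on both left and right context" in the definition can be illustrated with the attention mask shape alone. The sketch below is illustrative, not the actual BERT implementation: it contrasts a BERT-style bidirectional mask, where every position may attend to every other position, with the causal mask used by left-to-right language models.

```python
def causal_mask(n):
    # Left-to-right LM: position i may attend only to positions j <= i.
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

def bidirectional_mask(n):
    # BERT-style: every position may attend to all positions,
    # so each token's representation conditions on left AND right context.
    return [[1] * n for _ in range(n)]

print(causal_mask(3))         # lower-triangular: no access to future tokens
print(bidirectional_mask(3))  # all ones: full bidirectional context
```

Masking out future positions is what forces ordinary language models to be unidirectional; BERT instead masks out random input tokens (masked language modeling) so it can keep the full bidirectional mask during pretraining.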
Broader concept
Entry terms
- Bidirectional Encoder Representations from Transformers
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/8LP-NBH731S9-G