Concept information
Preferred term
BERT
Definition
- BERT is designed to pretrain deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers (Devlin et al., 2019).
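As an illustration of the definition above (not part of the thesaurus entry), the following is a minimal sketch using the Hugging Face `transformers` library to obtain BERT's contextual token representations; the checkpoint name `bert-base-uncased` and the use of PyTorch tensors are assumptions for the example.

```python
from transformers import AutoModel, AutoTokenizer

# Load a pretrained BERT model and its tokenizer (assumed checkpoint name).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode a sentence; each token's representation is conditioned on
# both its left and right context in every layer.
inputs = tokenizer("BERT reads the whole sentence at once.", return_tensors="pt")
outputs = model(**inputs)

# Contextual embeddings: one vector per token, shape [1, seq_len, hidden_size].
print(outputs.last_hidden_state.shape)
```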
Broader concept
Alternative labels
- Bidirectional Encoder Representations from Transformers
In other languages
-
French
URI
http://data.loterre.fr/ark:/67375/8LP-NBH731S9-G