ThesoTM thesaurus

Concept information

Preferred term

ruBERT

Definition

  • A BERT-based pre-trained language model for the Russian language.

Broader concept

Bibliographic citation(s)

  • Kuratov, Y., & Arkhipov, M. (2019). Adaptation of deep bidirectional multilingual transformers for Russian language. arXiv:1905.07213 [cs]. http://arxiv.org/abs/1905.07213

based on

has application field

has design country

  • Russia

has for input language

implements

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-R4K2SPTG-R
