ThesoTM thesaurus

Concept information

Preferred term

E5  

Definition

  • A language model for text embeddings, trained with weakly-supervised contrastive pre-training (Wang et al., 2022); see the usage sketch below.
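
As an illustrative, non-normative sketch (the intfloat/e5-base-v2 checkpoint, the sentence-transformers library, and the "query:"/"passage:" input prefixes are assumptions drawn from the cited papers and the public model release, not part of this entry), E5 embeddings might be computed as follows:

    from sentence_transformers import SentenceTransformer

    # Assumed publicly released E5 checkpoint; other E5 variants work the same way.
    model = SentenceTransformer("intfloat/e5-base-v2")

    # E5 is trained with role prefixes on its inputs ("query: " / "passage: ").
    queries = ["query: how do text embeddings work?"]
    passages = ["passage: Text embeddings map sentences to dense vectors."]

    q_emb = model.encode(queries, normalize_embeddings=True)
    p_emb = model.encode(passages, normalize_embeddings=True)

    # With normalized vectors, cosine similarity reduces to a dot product.
    score = (q_emb @ p_emb.T).item()
    print(score)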

Broader concept

Bibliographic citation(s)

  • Wang, L., Yang, N., Huang, X., Jiao, B., Yang, L., Jiang, D., Majumder, R., & Wei, F. (2022). Text embeddings by weakly-supervised contrastive pre-training (arXiv:2212.03533). arXiv. doi:10.48550/arXiv.2212.03533
  • Wang, L., Yang, N., Huang, X., Yang, L., Majumder, R., & Wei, F. (2023). Improving text embeddings with large language models (arXiv:2401.00368). arXiv. doi:10.48550/arXiv.2401.00368

has design country

  • United States

has for input language

implements

is encoded in

is executed in

has for license

In other languages

  • E5 (French)

URI

http://data.loterre.fr/ark:/67375/LTK-LMMM2HZB-2
