Concept information
Preferred term
SapBERT
Definition
- "a pretraining scheme that self-aligns the representation space of biomedical entities." (Liu et al., 2021, p. 4228).
Broader concept
Entry terms
- Self-aligning pretrained BERT
Bibliographic citation(s)
- Liu, F., Shareghi, E., Meng, Z., Basaldella, M., & Collier, N. (2021). Self-alignment pretraining for biomedical entity representations. In K. Toutanova, A. Rumshisky, L. Zettlemoyer, D. Hakkani-Tur, I. Beltagy, S. Bethard, R. Cotterell, T. Chakraborty, & Y. Zhou (Eds.), Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 4228–4238). Association for Computational Linguistics. doi:10.18653/v1/2021.naacl-main.334
based on
has design country
- United Kingdom
- United States
has for input language
has repository
is an application of
implements
is encoded in
has for license
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/LTK-W63LC46L-T
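The "self-alignment" named in the definition above is a metric-learning objective: embeddings of entity names that denote the same concept (e.g. sharing a UMLS CUI) are pulled together, while names of different concepts are pushed apart, via a multi-similarity loss. The NumPy sketch below illustrates that loss on toy vectors; the hyperparameter values and the toy data are illustrative assumptions, not the paper's settings or implementation.

```python
import numpy as np

def multi_similarity_loss(embeddings, labels, alpha=2.0, beta=50.0, lam=0.5):
    """Multi-similarity loss over a batch of entity-name embeddings.

    Names sharing a label are treated as synonyms (positives); all other
    pairs are negatives. Hyperparameters are illustrative, not SapBERT's.
    """
    # L2-normalise so the dot product is cosine similarity.
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = emb @ emb.T
    labels = np.asarray(labels)
    n = len(labels)
    losses = []
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        neg = labels != labels[i]
        if not pos.any() or not neg.any():
            continue
        # Positives below the threshold `lam` and negatives above it
        # both contribute; soft-max weighting focuses on hard pairs.
        pos_term = np.log1p(np.sum(np.exp(-alpha * (sim[i, pos] - lam)))) / alpha
        neg_term = np.log1p(np.sum(np.exp(beta * (sim[i, neg] - lam)))) / beta
        losses.append(pos_term + neg_term)
    return float(np.mean(losses))

# Toy batch: rows 0-1 are one concept, rows 2-3 another.
aligned = np.array([[1.0, 0.0], [0.99, 0.1], [0.0, 1.0], [0.1, 0.99]])
mixed = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])

# A well-aligned space (synonyms close, non-synonyms far) yields a
# lower loss than one where synonyms are scattered.
print(multi_similarity_loss(aligned, [0, 0, 1, 1]))
print(multi_similarity_loss(mixed, [0, 0, 1, 1]))
```

Minimising this loss over many synonym groups is what "self-aligns the representation space": no task labels are needed beyond the synonymy already present in the source ontology.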