Concept information
Preferred term
OPT
Definition
- A suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters, intended to be shared fully and responsibly with interested researchers (after Zhang et al., 2022).
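As a brief illustration (not part of the Loterre record): the publicly released OPT checkpoints can be loaded with the Hugging Face transformers library, a minimal sketch assuming the smallest public variant, facebook/opt-125m; the larger checkpoints follow the same naming scheme, with the 175B model available on request only.

# Minimal sketch: loading an OPT checkpoint via Hugging Face transformers.
# Assumes the publicly released "facebook/opt-125m" variant (125M parameters).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# OPT is decoder-only, so text is produced by autoregressive generation.
inputs = tokenizer("Open Pre-trained Transformers are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))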
Entry terms
- Open Pre-trained Transformers
Bibliographic citation(s)
- Zhang, S., Roller, S., Goyal, N., Artetxe, M., Chen, M., Chen, S., Dewan, C., Diab, M., Li, X., Lin, X. V., Mihaylov, T., Ott, M., Shleifer, S., Shuster, K., Simig, D., Koura, P. S., Sridhar, A., Wang, T., & Zettlemoyer, L. (2022). OPT: Open Pre-trained Transformer Language Models (arXiv:2205.01068). arXiv. doi:10.48550/arXiv.2205.01068
has design country
- United States
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/LTK-CQ6NWPMQ-3