Concept information
Preferred term
transformer
Definition
- "sequence transduction model based entirely on attention, replacing the recurrent layers most commonly used in encoder-decoder architectures with multi-headed self-attention." (Vaswani et al., 2017, p. 10).
Generic concept
Synonym(s)
- self-attention model
Bibliographic reference(s)
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017). Attention is all you need. arXiv:1706.03762 [cs]. http://arxiv.org/abs/1706.03762
basis of
is implemented by
- AbLang
- AlBERTo
- ALeaseBERT
- AlephBERT
- AraBERT
- araELECTRA
- AraGPT2
- ARBERT
- astroBERT
- AstroLLaMA
- BanglaBERT
- BART
- BARThez
- BEHRT
- BERTimbau
- BERTJapanese
- BERTje
- BERTopic
- BERT-PLI
- BERT-SentiX
- BERTurk
- BERTweet
- BETO
- BioALBERT
- BioBERT
- BioMedBERT
- BioMed-RoBERTa
- BLOOM
- BlueBERT
- byT5
- CamemBERT
- CancerBERT
- CANINE
- CharBERT
- ChemBERTa
- ClinicalBERT
- CodeBERT
- CodeGPT
- convBERT
- convBERTurk
- CoText
- COVID-Twitter-BERT
- Czech-B
- Czert-A
- DeBERTa
- DeLFT
- DistilBERT
- DrBERT
- E5
- ELECTRA
- entityBERT
- ERNIE
- estBERT
- Finnish BERT
- Flair
- FlauBERT
- Galactica
- Genomic ULMFit
- GenSLMs
- German BERT
- GluonNLP
- golgotha
- GottBERT
- GPT
- GPT-2
- GPT-3
- GreekBERT
- GreenBioBERT
- HateBERT
- HeBERT
- HerBERT
- IndicBERT
- IndoBERT
- KB-BERT
- KeBioLM
- KeyBERT
- KLUE-BERT
- KLUE-RoBERTa
- KM-BERT
- KoBERT
- KoGPT2
- KoreALBERT
- KR-BERT
- LaMDA
- Legal-BERT
- Legal-HeBERT
- Lemur
- Llama 2
- LongFormer
- MARBERT
- MASS
- MathBERT
- M-BERT
- MC-BERT
- medBERT
- MiniLM
- MobileBERT
- Mol-BERT
- mT5
- MuRIL
- nanoBERT
- NetBERT
- OAG-BERT
- OPT
- OuBioBERT
- PaLM
- ParsBERT
- PEGASUS
- PharmBERT
- PhoBERT
- PLBART
- PolishBERT
- polyBERT
- PolyNC
- ProkBERT
- ProteinBERT
- PTT5
- PubMedBERT
- RadBERT
- RBERT
- RobBERT
- RobeCzech
- RoBERTa
- RoBERTuito
- Romanian BERT
- ruBERT
- SapBERT
- SBERT
- sciBERT
- SciFive
- Slavic BERT
- spanBERT
- Spark NLP
- Spark NLP Python
- Spark NLP Scala
- srBERT
- StructBERT
- T5
- text
- TextAttack
- TinyBERT
- TOD-BERT
- Tohoku-BERT
- transformers
- TransPolymer
- UmBERTo
- UmlsBERT
- Unicoder
- UniLM
- UTH-BERT
- WangchanBERTa
- XLM-RoBERTa
- XLM-T
- XLNet
Translations
- French
  - modèle auto-attentif
  - modèle d'auto-attention
URI
http://data.loterre.fr/ark:/67375/LTK-PFDPNVQ7-3