Teven Le Scao
Hugging Face
HuggingFace's Transformers: State-of-the-art natural language processing
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771, 2019
Multitask prompted training enables zero-shot task generalization
V Sanh, A Webson, C Raffel, SH Bach, L Sutawika, Z Alyafeai, A Chaffin, ...
arXiv preprint arXiv:2110.08207, 2021
How Many Data Points is a Prompt Worth?
T Le Scao, AM Rush
arXiv preprint arXiv:2103.08493, 2021
Datasets: A community library for natural language processing
Q Lhoest, AV del Moral, Y Jernite, A Thakur, P von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
What Language Model Architecture and Pretraining Objective Work Best for Zero-Shot Generalization?
T Wang, A Roberts, D Hesslow, TL Scao, HW Chung, I Beltagy, J Launay, ...
arXiv preprint arXiv:2204.05832, 2022
What Language Model to Train if You Have One Million GPU Hours?
T Le Scao, T Wang, D Hesslow, L Saulnier, S Bekman, MS Bari, ...
Challenges & Perspectives in Creating Large Language Models, 2022
Neural Differential Equations for Single Image Super-Resolution
T Le Scao
ICLR 2020 Workshop on Integration of Deep Neural Models and Differential …, 2020
In-training Matrix Factorization for Parameter-frugal Neural Machine Translation
Z Kaden, TL Scao, R Olivier
arXiv preprint arXiv:1910.06393, 2019