Teven Le Scao
Hugging Face
Verified email at huggingface.co
Title
Cited by
Year
HuggingFace's Transformers: State-of-the-art natural language processing
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
arXiv preprint arXiv:1910.03771, 2019
1326 · 2019
Multitask prompted training enables zero-shot task generalization
V Sanh, A Webson, C Raffel, SH Bach, L Sutawika, Z Alyafeai, A Chaffin, ...
arXiv preprint arXiv:2110.08207, 2021
134 · 2021
How Many Data Points is a Prompt Worth?
T Le Scao, AM Rush
arXiv preprint arXiv:2103.08493, 2021
108* · 2021
Datasets: A community library for natural language processing
Q Lhoest, AV del Moral, Y Jernite, A Thakur, P von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
24 · 2021
What Language Model Architecture and Pretraining Objective Work Best for Zero-Shot Generalization?
T Wang, A Roberts, D Hesslow, TL Scao, HW Chung, I Beltagy, J Launay, ...
arXiv preprint arXiv:2204.05832, 2022
5 · 2022
What Language Model to Train if You Have One Million GPU Hours?
T Le Scao, T Wang, D Hesslow, L Saulnier, S Bekman, MS Bari, ...
Challenges & Perspectives in Creating Large Language Models, 2022
2 · 2022
Neural Differential Equations for Single Image Super-Resolution
T Le Scao
ICLR 2020 Workshop on Integration of Deep Neural Models and Differential …, 2020
2 · 2020
In-training Matrix Factorization for Parameter-frugal Neural Machine Translation
Z Kaden, TL Scao, R Olivier
arXiv preprint arXiv:1910.06393, 2019
1 · 2019
Articles 1–8