Maha Elbayad
Research scientist, Meta AI
Verified email at fb.com
Title · Cited by · Year
Pervasive attention: 2D convolutional neural networks for sequence-to-sequence prediction
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:1808.03867, 2018
Cited by 92 · 2018
Depth-adaptive transformer
M Elbayad, J Gu, E Grave, M Auli
arXiv preprint arXiv:1910.10073, 2019
Cited by 74 · 2019
Findings of the IWSLT 2022 Evaluation Campaign
A Anastasopoulos, L Barrault, L Bentivogli, MZ Boito, O Bojar, R Cattoni, ...
Proceedings of the 19th International Conference on Spoken Language …, 2022
Cited by 38 · 2022
Efficient wait-k models for simultaneous machine translation
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:2005.08595, 2020
Cited by 36 · 2020
Token-level and sequence-level loss smoothing for RNN language models
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:1805.05062, 2018
Cited by 20 · 2018
No language left behind: Scaling human-centered machine translation
NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ...
arXiv preprint arXiv:2207.04672, 2022
Cited by 19* · 2022
Online versus offline NMT quality: An in-depth analysis on English-German and German-English
M Elbayad, M Ustaszewski, E Esperança-Rodier, FB Manquat, J Verbeek, ...
arXiv preprint arXiv:2006.00814, 2020
Cited by 8 · 2020
ON-TRAC Consortium for End-to-End and Simultaneous Speech Translation Challenge Tasks at IWSLT 2020
M Elbayad, H Nguyen, F Bougares, N Tomashenko, A Caubrière, ...
arXiv preprint arXiv:2005.11861, 2020
Cited by 8 · 2020
Joint source–target encoding with pervasive attention
M Elbayad, L Besacier, J Verbeek
Machine Translation 35 (4), 637-659, 2021
2021
Proceedings of the Second Workshop on Automatic Simultaneous Translation
H Wu, C Cherry, L Huang, Z He, Q Liu, M Elbayad, M Liberman, H Wang, ...
Proceedings of the Second Workshop on Automatic Simultaneous Translation, 2021
2021
Rethinking the Design of Sequence-to-Sequence Models for Efficient Machine Translation
M Elbayad
Université Grenoble Alpes, 2020
2020