Exploring Massively Multilingual, Massive Neural Machine Translation
- Orhan Firat, Akiko Eriguchi | Google Research, Microsoft
We will give an overview of recent efforts toward universal translation at Google Research, from training a single translation model for 100+ languages to scaling neural networks beyond 80 billion parameters with Transformers 1000 layers deep. We will share insights into the research and engineering challenges the project has tackled: multi-task learning with hundreds of tasks, learning under heavy data imbalance, trainability of very deep networks, understanding the learned representations, cross-lingual downstream transfer, and more.
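One of the challenges mentioned above, learning under heavy data imbalance, is commonly addressed in massively multilingual training with temperature-based sampling over language pairs. The sketch below illustrates the general technique; the dataset sizes and temperature value are hypothetical, not figures from the talk.

```python
# Hypothetical dataset sizes (sentence pairs) for a few language pairs;
# real systems cover 100+ languages with far larger, more skewed corpora.
sizes = {"en-fr": 40_000_000, "en-hi": 1_500_000, "en-sw": 200_000}

def sampling_probs(sizes, temperature=5.0):
    """Temperature-based sampling: p_l is proportional to (n_l / N) ** (1 / T).

    T = 1 reproduces proportional sampling (high-resource pairs dominate);
    as T grows, the distribution flattens toward uniform, so low-resource
    pairs are seen more often during training.
    """
    total = sum(sizes.values())
    weights = {pair: (n / total) ** (1.0 / temperature) for pair, n in sizes.items()}
    z = sum(weights.values())
    return {pair: w / z for pair, w in weights.items()}

probs = sampling_probs(sizes, temperature=5.0)
for pair, p in probs.items():
    print(f"{pair}: {p:.3f}")
```

With a higher temperature, the low-resource pair (en-sw here) receives a far larger share of training batches than its raw corpus proportion would give it, at the cost of slightly under-sampling the high-resource pairs.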
[Slides]
Christian Federmann, Senior Data Scientist