News & features
Microsoft Translator enhanced with Z-code Mixture of Experts models
| Hany Hassan Awadalla, Krishna Doss Mohan, and Vishal Chowdhary
Translator, a Microsoft Azure Cognitive Service, is adopting Z-code Mixture of Experts models, a breakthrough AI technology that significantly improves the quality of production translation models. As a component of Microsoft’s larger XYZ-code initiative to combine AI models for text,…
Microsoft Translator: Now translating 100 languages and counting!
| Krishna Doss Mohan and Jann Skotdal
Today, we’re excited to announce that Microsoft Translator has added 12 new languages and dialects to the growing repertoire of Microsoft Azure Cognitive Services Translator, bringing us to a total of 103 languages! The new languages, which are natively spoken…
DeepSpeed powers 8x larger MoE model training with high performance
| DeepSpeed Team and Z-code Team
Today, we are proud to announce DeepSpeed MoE, a high-performance system that supports massive scale mixture of experts (MoE) models as part of the DeepSpeed optimization library. MoE models are an emerging class of sparsely activated…
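The excerpt above describes MoE models as "sparsely activated": a gating network routes each input to only a few experts, so compute cost stays roughly flat as the expert count grows. The following is a minimal NumPy sketch of a top-k gated MoE layer; it is an illustration of the general technique, not DeepSpeed's actual implementation, and all names (`moe_layer`, `gate_w`, `experts`) are hypothetical.

```python
import numpy as np

def moe_layer(x, gate_w, experts, k=2):
    """Sparsely activated mixture-of-experts layer with top-k gating.

    x:       (d,) input vector
    gate_w:  (d, n_experts) gating weights
    experts: list of callables, each mapping a (d,) vector to a (d,) vector
    Only the top-k experts are evaluated per input, which is what makes
    the layer "sparsely activated".
    """
    logits = x @ gate_w                       # (n_experts,) gating scores
    top = np.argsort(logits)[-k:]             # indices of the k best experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                      # softmax over selected experts only
    # Weighted combination of the k selected experts' outputs
    return sum(p * experts[i](x) for p, i in zip(probs, top))

# Toy usage: 8 random linear experts over 4-dimensional inputs
rng = np.random.default_rng(0)
d, n = 4, 8
experts = [(lambda w: (lambda x: x @ w))(rng.normal(size=(d, d)))
           for _ in range(n)]
y = moe_layer(rng.normal(size=d), rng.normal(size=(d, n)), experts, k=2)
print(y.shape)  # (4,)
```

In practice, systems like DeepSpeed MoE add expert parallelism (sharding experts across GPUs) and load-balancing losses on top of this basic routing idea; the sketch only captures the sparse-activation core.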
A holistic representation toward integrative AI
| Xuedong Huang
At Microsoft, we have been on a quest to advance AI beyond existing techniques, by taking a more holistic, human-centric approach to learning and understanding. As Chief Technology Officer of Azure AI Cognitive Services, I have been working with a…