Microsoft Research Blog
Research Focus: Week of October 28, 2024
New Research | FLASH: Workflow automatio…
Research at Microsoft 2023: A year of groundbreaking AI advances and discoveries
AI saw unparalleled growth in 2023, reac…
Announcing the DeepSpeed4Science Initiative: Enabling large-scale scientific discovery through sophisticated AI system technologies
| Shuaiwen Leon Song, Bonnie Kruft, Minjia Zhang, Conglong Li, Martin Cai, and Yuxiong He
Editor’s note, Sept. 28, 2023 – The foun…
DeepSpeed ZeRO++: A leap in speed for LLM and chat model training with 4X less communication
| DeepSpeed Team and Andrey Proskurin
Large AI models are transforming the dig…
Research Focus: Week of November 7, 2022
Welcome to Research Focus, a new series …
DeepSpeed Compression: A composable library for extreme compression and zero-cost quantization
| DeepSpeed Team and Andrey Proskurin
Large-scale models are revolutionizing d…
DeepSpeed: Advancing MoE inference and training to power next-generation AI scale
| DeepSpeed Team and Andrey Proskurin
In the last three years, the largest tra…
Research at Microsoft 2021: Collaborating for real-world change
Over the past 30 years, Microsoft Resear…
Tutel: An efficient mixture-of-experts implementation for large DNN model training
| Wei Cui, Yifan Xiong, Peng Cheng, and Rafael Salas
Mixture of experts (MoE) is a deep learn…