Microsoft Research Blog
AI saw unparalleled growth in 2023, reac…
| Shuaiwen Leon Song, Bonnie Kruft, Minjia Zhang, Conglong Li, Martin Cai, and Yuxiong He
Editor’s note, Sept. 28, 2023 – The foun…
Microsoft Research Blog
DeepSpeed ZeRO++: A leap in speed for LLM and chat model training with 4X less communication
| DeepSpeed Team and Andrey Proskurin
Large AI models are transforming the dig…
Microsoft Research Blog
DeepSpeed Compression: A composable library for extreme compression and zero-cost quantization
| DeepSpeed Team and Andrey Proskurin
Large-scale models are revolutionizing d…
| DeepSpeed Team and Andrey Proskurin
In the last three years, the largest tra…
Over the past 30 years, Microsoft Resear…
| Wei Cui, Yifan Xiong, Peng Cheng, and Rafael Salas
Mixture of experts (MoE) is a deep learn…
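The teaser above starts from the basic MoE idea: a gating network routes each token to a small subset of expert networks, so parameter count can grow without a matching growth in per-token compute. As a rough illustration only (the class name, sizes, and the top-1 gate here are assumptions for this sketch, not the Tutel implementation the post describes), a minimal PyTorch version:

```python
# Minimal sketch of a mixture-of-experts (MoE) layer with top-1 gating.
# Illustrative only; names and sizes are assumptions, not Tutel's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, d_model=16, num_experts=4):
        super().__init__()
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(num_experts)
        )
        # The gate scores each token against every expert.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x):                            # x: (tokens, d_model)
        scores = F.softmax(self.gate(x), dim=-1)
        top_score, top_expert = scores.max(dim=-1)   # top-1 routing
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_expert == e
            if mask.any():
                # Only the tokens routed to this expert pass through it,
                # so per-token compute stays constant as experts are added.
                out[mask] = top_score[mask].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(8, 16)
print(TinyMoE()(tokens).shape)   # torch.Size([8, 16])
```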