We need to take a good hard look at the problems we’re solving before we jump to solutions. Yes, coming up with solutions is the interesting and exciting part, but once you’re on this path it’s very hard to turn…
In the news | Talking Beats
"As I scientist I ask-- where's the data on COVID vaccines? How many women, how many men, how many shots? Which arm? Did it cause their diabetes to get better? Did it affect their time off of work? What kind…
Awards | ICDM 2021
Wei Chen was awarded the 2021 IEEE ICDM 10-Year Highest-Impact Paper Award for his paper, "IRIE: Scalable and Robust Influence Maximization in Social Networks."
This year, six papers were chosen as recipients of the Outstanding Paper Award. The committee selected these papers for their excellent clarity, insight, creativity, and potential for lasting impact. Additional details about the selection process are provided below…
| Mengyu Dai and Junwon Park
Generative image models offer unique value by creating new images: sharp super-resolution versions of existing images, or even realistic-looking synthetic photographs. Generative Adversarial Networks (GANs) and their variants have demonstrated pioneering success with the framework…
In the news | MarTechPost
Tutel is a high-performance MoE library developed by Microsoft researchers to aid the development of large-scale DNN (deep neural network) models. Tutel is highly optimized for the new Azure NDm A100 v4 series, and its diverse and flexible MoE algorithmic…
Awards | IEEE
Dr. Xing Xie was elevated to IEEE Fellow 2022 for his contributions to spatial data mining and recommendation systems. IEEE Fellow is a distinction reserved for select IEEE members whose extraordinary accomplishments in any of the IEEE fields of interest…
| Wei Cui, Yifan Xiong, Peng Cheng, and Rafael Salas
Mixture of experts (MoE) is a deep learning model architecture in which computational cost grows sublinearly with the number of parameters, making scaling easier. Today, MoE is the only approach that has been demonstrated to scale deep learning models beyond a trillion parameters, paving…
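The sublinear-cost idea above can be illustrated with a minimal sketch of a top-1 gated MoE layer (this is an illustrative toy in NumPy, not Tutel's actual API or the authors' implementation): each token is routed to a single expert, so per-token compute stays roughly constant while the total parameter count grows linearly with the number of experts.

```python
# Toy top-1 mixture-of-experts layer (illustrative sketch, not Tutel code).
# Per-token compute uses only one expert's weights, so cost does not scale
# with the number of experts even though total parameters do.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_hidden, n_experts = 8, 16, 4
# Expert parameters: n_experts independent two-layer MLPs.
W1 = rng.standard_normal((n_experts, d_model, d_hidden)) * 0.1
W2 = rng.standard_normal((n_experts, d_hidden, d_model)) * 0.1
# Gating network: scores each token against every expert.
Wg = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_layer(x):
    """x: (n_tokens, d_model) -> (n_tokens, d_model)."""
    logits = x @ Wg                               # (n_tokens, n_experts)
    expert = logits.argmax(axis=1)                # top-1 routing decision
    gate = np.exp(logits - logits.max(axis=1, keepdims=True))
    gate = gate / gate.sum(axis=1, keepdims=True) # softmax gate weights
    out = np.zeros_like(x)
    for e in range(n_experts):
        idx = np.where(expert == e)[0]            # tokens routed to expert e
        if idx.size == 0:
            continue
        h = np.maximum(x[idx] @ W1[e], 0.0)       # expert MLP with ReLU
        out[idx] = gate[idx, e:e + 1] * (h @ W2[e])
    return out

tokens = rng.standard_normal((5, d_model))
y = moe_layer(tokens)
print(y.shape)  # (5, 8)
```

Adding more experts enlarges `W1`/`W2` (more capacity) without changing how much work any single token requires, which is the property that lets MoE models scale to very large parameter counts.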
In the news | Microsoft Translator Blog
Microsoft is on a quest for AI at Scale, with the ambition of enabling the next generation of AI experiences. The Microsoft Translator ZCode team is working together with Microsoft Project Turing and Microsoft Research Asia to advance language and…