In the news | Microsoft Translator Blog

Microsoft Translator releases literary Chinese translation 

August 25, 2021

When reading ancient Chinese poetry, we often marvel at the wonderful words ancient writers used to describe people, events, objects, and scenes. This is a splendid cultural treasure left behind for us. However, much like Shakespeare’s verses in English,…
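
Below is a hedged sketch of calling the Translator v3 REST API for the new capability. The subscription key and region are placeholders, and treating "lzh" as the literary Chinese language code is an assumption based on this release, not a detail from the post.

```python
# A minimal sketch of translating literary Chinese via the Translator REST API.
# Key/region are placeholders; "lzh" as the language code is an assumption.
import requests

endpoint = "https://api.cognitive.microsofttranslator.com/translate"
params = {"api-version": "3.0", "from": "lzh", "to": "en"}
headers = {
    "Ocp-Apim-Subscription-Key": "<your-translator-key>",      # placeholder
    "Ocp-Apim-Subscription-Region": "<your-resource-region>",  # placeholder
    "Content-Type": "application/json",
}
body = [{"text": "床前明月光，疑是地上霜。"}]  # Li Bai, "Quiet Night Thought"

response = requests.post(endpoint, params=params, headers=headers, json=body)
print(response.json()[0]["translations"][0]["text"])
```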

Articles

The true cost of your idea 

August 24, 2021

Every idea we implement, whether it succeeds or fails, carries hidden costs that we often don’t consider. These hidden costs may be keeping your customers from fully adopting your idea.

In the news | Anyscale Blog

Fast AutoML with FLAML + Ray Tune 

August 24, 2021

FLAML is a lightweight Python library from Microsoft Research that finds accurate machine learning models efficiently and economically, using cutting-edge algorithms designed to be resource-efficient and easily parallelizable. FLAML can also use Ray Tune for distributed…
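
As a quick orientation, here is a minimal sketch of FLAML's AutoML interface on a toy dataset; the dataset, time budget, and printed attributes are illustrative, and the distributed Ray Tune integration the post describes is not shown here.

```python
# A minimal sketch of FLAML's AutoML search (illustrative dataset and budget).
from flaml import AutoML
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

automl = AutoML()
automl.fit(
    X_train=X,
    y_train=y,
    task="classification",  # FLAML also supports regression, ranking, etc.
    time_budget=60,         # seconds to spend searching models + hyperparameters
)
print(automl.best_estimator, automl.best_config)
```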

In the news | OctoML

With Apache TVM, Microsoft Research develops and serves the latest computer vision algorithms on live streams 

August 24, 2021

OctoML’s engineering team collaborated with Microsoft Research on the “Watch For” project, an AI system that analyzes live video streams and identifies specified events within them. The collaboration sped up inference for the deep learning algorithms that analyze the video streams,…
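
To make the TVM workflow concrete, here is a hedged sketch of compiling a vision model with Apache TVM for faster CPU inference. The "detector.onnx" file and the input name and shape are hypothetical stand-ins, not the actual Watch For models or pipeline.

```python
# A minimal sketch of compiling and running an ONNX model with Apache TVM.
# Model file, input name, and shape are hypothetical placeholders.
import numpy as np
import onnx
import tvm
from tvm import relay
from tvm.contrib import graph_executor

onnx_model = onnx.load("detector.onnx")
mod, params = relay.frontend.from_onnx(onnx_model, shape={"input": (1, 3, 224, 224)})

# Compile with TVM's highest optimization level for the local CPU target.
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm", params=params)

dev = tvm.cpu(0)
module = graph_executor.GraphModule(lib["default"](dev))

# Run one frame through the compiled module.
frame = np.random.rand(1, 3, 224, 224).astype("float32")
module.set_input("input", frame)
module.run()
scores = module.get_output(0).numpy()
```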

In the news | Microsoft Innovation

Tech Minutes: CodeBERT 

August 19, 2021

A Pre-Trained Model for Programming and Natural Languages.
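
For readers who want to try the model, here is a minimal sketch of loading the published CodeBERT checkpoint via Hugging Face Transformers; the natural-language/code pair is an arbitrary example, not taken from the Tech Minutes episode.

```python
# A minimal sketch of encoding a natural-language/code pair with CodeBERT.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")

# CodeBERT is bimodal: it jointly encodes a description and a code snippet.
inputs = tokenizer(
    "return maximum value",
    "def max(a, b): return a if a > b else b",
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```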

Figure: DeepSpeed MoE supports models 8x larger with expert parallelism + ZeRO-Offload than with expert parallelism alone (supported model sizes on NVIDIA A100 GPUs); it scales near-linearly with the number of GPUs; and Z-code MoE (10B) consistently outperforms other systems on BLEU on an in-house 50-language test set.
Microsoft Research Blog

DeepSpeed powers 8x larger MoE model training with high performance 

August 18, 2021 | DeepSpeed Team and Z-code Team

Today, we are proud to announce DeepSpeed MoE, a high-performance system that supports massive-scale mixture-of-experts (MoE) models as part of the DeepSpeed optimization library. MoE models are an emerging class of sparsely activated…
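
To illustrate what a sparsely activated layer looks like in code, here is a hedged sketch of DeepSpeed's MoE layer wrapping a standard transformer feed-forward block. The sizes and the single-process gloo setup are illustrative conveniences, not the Z-code training configuration, which runs under the deepspeed launcher across many GPUs.

```python
# A minimal sketch of DeepSpeed's MoE layer (illustrative sizes and setup).
import os
import torch
import deepspeed
from deepspeed.moe.layer import MoE

# Single-process distributed init so the MoE layer can build its process groups.
os.environ.setdefault("RANK", "0")
os.environ.setdefault("WORLD_SIZE", "1")
os.environ.setdefault("LOCAL_RANK", "0")
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
deepspeed.init_distributed(dist_backend="gloo")

hidden_size = 1024
expert = torch.nn.Sequential(            # one expert = one FFN block
    torch.nn.Linear(hidden_size, 4 * hidden_size),
    torch.nn.ReLU(),
    torch.nn.Linear(4 * hidden_size, hidden_size),
)

# Sparsely activated: each token is routed to its top-1 expert out of 8.
moe_layer = MoE(hidden_size=hidden_size, expert=expert, num_experts=8, k=1)

x = torch.randn(2, 16, hidden_size)      # (batch, sequence, hidden)
output, aux_loss, _ = moe_layer(x)
print(output.shape, aux_loss.item())
```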

Articles

Designing for neurodivergent students: What we’ve learned so far 

August 18, 2021

Our guidelines for informing inclusive product design for students are a work in progress. We began with Microsoft’s universal design principles and the Universal Design for Learning guidelines and supplemented them with research that other teams within the company conducted with…

One paper on table structure understanding accepted at KDD ’21! 

August 14, 2021

In the news | Microsoft Educator Developer Blog

Learning how to build a Microsoft Azure Health Bot 

August 13, 2021

The Microsoft Learn Student Ambassadors community is for students who want to use tech to solve real-world problems with like-minded peers, establish themselves as mentors and leaders in their community, and amplify their impact. The Microsoft Learn Student Ambassadors, Health League…
