In the news | New York Times

You Can’t Spell Creative Without A.I. 

April 8, 2020

Advances in software applications that process human language lie at the heart of the debate over whether computer technologies will enhance or even substitute for human creativity.

Microsoft Research Podcast

Microsoft’s AI Transformation, Project Turing and smarter search with Rangan Majumder 

March 25, 2020

Rangan Majumder is the Partner Group Program Manager of Microsoft’s Search and AI, and he has a simple goal: to make the world smarter and more productive. But nobody said simple was easy, so he and his team are working…

Microsoft Research Blog

Turing-NLG: A 17-billion-parameter language model by Microsoft 

February 13, 2020 | Corby Rosset

Turing Natural Language Generation (T-NLG) is a 17-billion-parameter language model by Microsoft that outperforms the state of the art on many downstream NLP tasks. We present a…
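
For a rough sense of why a model at this scale pushes past single-GPU limits, here is a back-of-envelope estimate (an illustration using conventional mixed-precision Adam accounting of roughly 16 bytes per parameter, not figures taken from the post itself):

# Back-of-envelope memory estimate for a 17-billion-parameter model.
# Assumption: standard mixed-precision Adam accounting
# (2 B fp16 weights + 2 B fp16 gradients + ~12 B fp32 optimizer states),
# ignoring activations and framework buffers.
params = 17e9

weights_fp16_gb = params * 2 / 1e9      # fp16 copy of the weights alone
training_state_gb = params * 16 / 1e9   # weights + gradients + Adam states

print(f"fp16 weights alone:  ~{weights_fp16_gb:.0f} GB")    # ~34 GB
print(f"full training state: ~{training_state_gb:.0f} GB")  # ~272 GB

Either number is well beyond the 32 GB of memory on a single V100 GPU, which is why training at this scale depends on the memory-optimization work described in the DeepSpeed post below.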

Microsoft Research Blog

ZeRO & DeepSpeed: New system optimizations enable training models with over 100 billion parameters 

February 13, 2020 | DeepSpeed Team, Rangan Majumder, and Junhua Wang

The latest trend in AI is that larger natural language models provide better accuracy; however, larger models are difficult to train because of their cost, long training times, and the difficulty of integrating the necessary code. Microsoft is releasing an open-source library called DeepSpeed, which vastly…
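
A minimal sketch of how training code is wired up with the library (illustrative only: MyModel and train_loader are hypothetical placeholders, and keyword names can vary between DeepSpeed versions):

# Illustrative DeepSpeed training loop; MyModel and train_loader are
# placeholders, not code from the announcement.
import argparse
import deepspeed

parser = argparse.ArgumentParser()
parser = deepspeed.add_config_arguments(parser)   # adds --deepspeed_config, etc.
cmd_args = parser.parse_args()

model = MyModel()   # any torch.nn.Module

# The engine returned here manages distributed data parallelism, fp16,
# and ZeRO optimizer-state partitioning according to the JSON config.
model_engine, optimizer, _, _ = deepspeed.initialize(
    args=cmd_args,
    model=model,
    model_parameters=model.parameters(),
)

for inputs, labels in train_loader:
    inputs = inputs.to(model_engine.device)
    labels = labels.to(model_engine.device)
    loss = model_engine(inputs, labels)   # forward pass assumed to return the loss
    model_engine.backward(loss)           # engine-managed backward pass
    model_engine.step()                   # optimizer step and gradient clearing

Such a script would then typically be started with the deepspeed launcher, for example: deepspeed train.py --deepspeed_config ds_config.json, where the JSON config enables fp16 and ZeRO and defines the optimizer.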

In the news | Fortune

A.I. and tackling the risk of “digital redlining” 

February 11, 2020

Last week, a Dutch court ordered the government in the Netherlands to stop using a machine-learning algorithm for detecting welfare fraud, citing human rights violations. The system, called System Risk Indicator (SyRI) in English, was being used by four Dutch…

In the news | Neowin

Microsoft builds the world’s largest transformer-based language generation model 

February 11, 2020

Transformer-based language generation models have enabled better conversational applications. Though they still have shortcomings, some of which were recently exposed by a team at MIT, researchers continue to improve them, building larger and more robust models.

In the news | WinBuzzer

Microsoft’s New Turing NLG is the Largest Transformer Language Model 

February 11, 2020

Microsoft has developed a Transformer-based language generation model that it describes as the largest ever made. This week, Microsoft AI & Research announced Turing NLG, which is twice the size of its nearest competitor.

In the news | WinBuzzer

Microsoft DeepSpeed with ZeRO Can Train 100 Billion Parameter AI Models 

February 11, 2020

Microsoft has released a new open-source library called DeepSpeed which, when combined with its ‘ZeRO’ module, can train models with 100 billion parameters without the hardware resources that training at this scale has traditionally required.

In the news | ITPro

Microsoft unveils ‘largest ever’ AI natural language model 

February 11, 2020

Microsoft has revealed its largest deep learning language model, the Turing Natural Language Generation (T-NLG), which is claimed to have a record-breaking 17 billion parameters. The T-NLG, according to Microsoft, outperforms the largest deep learning models to date: the University of Washington’s Grover-Mega and Nvidia’s MegatronLM, which…
