In the news | WinBuzzer

Microsoft’s New Turing NLG is the Largest Transformer Language Model 

February 11, 2020

Microsoft has developed a Transformer-based language generation model that it describes as the largest ever made. This week, Microsoft AI & Research announced Turing NLG, which is twice the size of its nearest competitor.

In the news | WinBuzzer

Microsoft DeepSpeed with Zero Can Train 100 Billion Parameter AI Models 

February 11, 2020

Microsoft has released a new open-source library called DeepSpeed which, when combined with its ‘ZeRO’ module, can train 100-billion-parameter models without the hardware resources traditionally required for models of that size.
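
For context on how that capability is typically switched on, DeepSpeed runs are driven by a small configuration object in which ZeRO and related memory-saving options are enabled. The sketch below is not taken from the article; the key names follow DeepSpeed's documented JSON configuration format, but the specific values (batch size, learning rate, ZeRO stage) are illustrative assumptions only.

    # Minimal sketch of a DeepSpeed configuration with ZeRO enabled.
    # Key names follow DeepSpeed's documented config format; the values
    # below are illustrative assumptions, not recommendations.
    ds_config = {
        "train_batch_size": 32,
        "fp16": {"enabled": True},            # mixed precision reduces memory use
        "zero_optimization": {"stage": 1},    # partition optimizer state across GPUs
        "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    }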

In the news | ITPro

Microsoft unveils ‘largest ever’ AI natural language model 

February 11, 2020

Microsoft has revealed its largest deep learning language model, the Turing Natural Language Generation (T-NLG), which is claimed to have a record-breaking 17 billion parameters. The T-NLG, according to Microsoft, outperforms the largest deep learning models to date: the University of Washington’s Grover-Mega and Nvidia’s MegatronLM, which…

In the news | MSPoweruser

Meet Microsoft DeepSpeed, a new deep learning library that can train massive 100-billion-parameter models 

February 10, 2020

Microsoft Research today announced DeepSpeed, a new deep learning optimization library that can train massive 100-billion-parameter models. In AI, larger natural language models generally deliver better accuracy, but training larger natural language models is time consuming and…

In the news | Future Decoded Mumbai CEO Summit

Satya Nadella talks about HAMS in his keynote during his visit to India

February 10, 2020

In the news | VentureBeat

Microsoft trains world’s largest Transformer language model 

February 10, 2020

Microsoft AI & Research today shared what it calls the largest Transformer-based language generation model ever and open-sourced a deep learning library named DeepSpeed to make distributed training of large models easier.

In the news | InfoWorld

Microsoft speeds up PyTorch with DeepSpeed 

February 10, 2020

Microsoft has released DeepSpeed, a new deep learning optimization library for PyTorch that is designed to reduce memory use and train models with better parallelism on existing hardware.
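
To make the PyTorch integration concrete, the sketch below wraps an ordinary PyTorch module with DeepSpeed. The deepspeed.initialize entry point and the engine's backward() and step() methods are part of DeepSpeed's public API; SimpleNet, get_batches() and the configuration values are hypothetical stand-ins, and passing the configuration as an in-code dict (rather than a JSON file referenced on the command line) is an assumption that may depend on the library version.

    # Minimal sketch: training an existing PyTorch model through DeepSpeed.
    # Typically launched across GPUs with the `deepspeed` command-line launcher.
    import torch
    import deepspeed

    class SimpleNet(torch.nn.Module):        # hypothetical toy model
        def __init__(self):
            super().__init__()
            self.linear = torch.nn.Linear(512, 10)

        def forward(self, x):
            return self.linear(x)

    ds_config = {
        "train_batch_size": 32,
        "fp16": {"enabled": True},
        "zero_optimization": {"stage": 1},
        "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    }

    # deepspeed.initialize returns an engine that owns the optimizer,
    # mixed precision, and the distributed/parallel training details.
    model = SimpleNet()
    model_engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config=ds_config,
    )

    for inputs, labels in get_batches():                  # hypothetical data iterator
        inputs = inputs.to(model_engine.device).half()    # fp16 is enabled above
        labels = labels.to(model_engine.device)
        loss = torch.nn.functional.cross_entropy(model_engine(inputs), labels)
        model_engine.backward(loss)                       # engine-managed backward pass
        model_engine.step()                               # engine-managed optimizer step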

Articles

How to build effective human-AI interaction: Considerations for machine learning and software engineering 

February 7, 2020

Besmira Nushi, Saleema Amershi, Ece Kamar, Gagan Bansal, Dan Weld, Mihaela Vorvoreanu, Eric Horvitz

To help practitioners design better user-facing AI-based systems, Microsoft recently published a set of guidelines for Human-AI Interaction based on decades of research and validated through…

Awards | National Academy of Engineering

Abigail Sellen elected to National Academy of Engineering 

February 6, 2020

Abigail (Abi) Sellen, Deputy Director and Principal Researcher, was elected as an international member of the National Academy of Engineering (NAE), “for contributions that ensure consideration of human capabilities in the design of computer systems.” The NAE membership honors those who…
