News & in-depth articles
In the news | New York Times
You Can’t Spell Creative Without A.I.
Advances in software applications that p…
Microsoft Research Blog
Turing-NLG: A 17-billion-parameter language model by Microsoft
| Corby Rosset
This figure was adapted from a similar i…
Microsoft Research Blog
ZeRO & DeepSpeed: New system optimizations enable training models with over 100 billion parameters
| DeepSpeed Team, Rangan Majumder, and Junhua Wang
The latest trend in AI is that larger na…
In the news | Fortune
A.I. and tackling the risk of “digital redlining”
Last week, a Dutch court ordered the gov…
Transformer-based language generation mo…
In the news | WinBuzzer
Microsoft’s New Turing NLG is the Largest Transformer Language Model
Microsoft has developed a Transformer-ba…
Microsoft has released a new open-source…
Microsoft has revealed its largest deep …