Microsoft Research Blog

Archai can design your neural network with state-of-the-art neural architecture search (NAS) 

October 1, 2020 | Shital Shah and Debadeepta Dey
The goal of neural architecture search (NAS) is to have computers automatically search for the best-performing neural networks. Recent advances in NAS methods have made it possible to build problem-specific networks that are faster, more compact, and less power hungry than…

Recent Posts

  1. CodeXGLUE: A benchmark dataset and open challenge for code intelligence 

    September 29, 2020

    According to Evans Data Corporation, there were 23.9 million professional developers in 2019, and the population is expected to reach 28.7 million in 2024. With the growing population of developers, code intelligence, which aims to leverage AI to help software developers…

  2. Project InnerEye open-source deep learning toolkit: Democratizing medical imaging AI 

    September 22, 2020 | Javier Alvarez-Valle and Gregory J. Moore, MD, PhD

    For over a decade, the Project InnerEye team at Microsoft Research Cambridge has been developing state-of-the-art machine learning methods for the automatic, quantitative analysis of three-dimensional medical images. An important application is to assist clinicians with image preparation and planning tasks for radiotherapy cancer treatment…

  3. Dialogue as Dataflow: A new approach to conversational AI 

    September 21, 2020

    By the Semantic Machines research team “Easier said than done.” These four words reflect the promise of conversational AI. It takes just seconds to ask “When are Megan and I both free?” but much longer to find out manually from a calendar. Indeed, almost everything…

  4. DeepSpeed: Extreme-scale model training for everyone 

    September 10, 2020 | DeepSpeed Team, Rangan Majumder, and Junhua Wang

    In February, we announced DeepSpeed, an open-source deep learning training optimization library, and ZeRO (Zero Redundancy Optimizer), a novel memory optimization technology in the library, which vastly advances large model training by improving scale, speed, cost, and usability. DeepSpeed has enabled researchers to create Turing…

Explore More

  • Events & conferences

    Meet our community of researchers, learn about exciting research topics, and grow your network

  • Podcasts

    Ongoing conversations at the cutting edge of research

  • Microsoft Research Forum

    Join us for a continuous exchange of ideas about research in the era of general AI