Microsoft Research Blog

  1. The science behind semantic search: How AI from Bing is powering Azure Cognitive Search

    Azure Cognitive Search is a cloud search service that gives developers APIs and tools to build rich search experiences over private, heterogeneous content in web, mobile, and enterprise applications. It has multiple components, including an API for indexing and querying, seamless integration through Azure data ingestion, deep integration with Azure Cognitive Services, and persistent storage of user-owned indexed content. At…
    March 2, 2021
  2. AAAI 2021: Accelerating the impact of artificial intelligence

    The purpose of the Association for the Advancement of Artificial Intelligence, according to its bylaws, is twofold. The first is to promote research in the area of AI, and the second is to promote the responsible use of these types of technology. The result was a 35th AAAI Conference on Artificial Intelligence (AAAI-21) schedule that broadens the possibilities of AI and is heavily reflective of a…
    February 24, 2021

  1. Research Collection – Shall we play a game?

    From a research point of view, games offer an amazing environment in which to develop new machine learning algorithms and techniques. And we hope, in due course, that those new algorithms will feed back not just into gaming, but into many other domains. Beyond the…
    February 5, 2021
  2. LAMBDA: The ultimate Excel worksheet function

    Ever since it was released in the 1980s, Microsoft Excel has changed how people organize, analyze, and visualize their data, providing a basis for decision-making for the millions of people who use it each day. It’s also the world’s most widely used programming language. Excel…
    January 25, 2021 by Andy Gordon and Simon Peyton Jones
  3. Three mysteries in deep learning: Ensemble, knowledge distillation, and self-distillation

    Under now-standard techniques, such as over-parameterization, batch-normalization, and adding residual links, “modern age” neural network training—at least for image classification tasks and many others—is usually quite stable. Using standard neural network architectures and training algorithms (typically SGD with momentum), the learned models perform consistently well,…
    January 19, 2021 by Zeyuan Allen-Zhu and Yuanzhi Li
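The last teaser names SGD with momentum as the typical training algorithm. As a hedged aside, here is a minimal sketch of the momentum update rule on a toy quadratic loss; the function name, step size, and momentum coefficient are illustrative choices, not values from the post:

```python
def sgd_momentum_step(w, v, grad, lr=0.1, mu=0.9):
    # The velocity v accumulates an exponentially decaying sum of past
    # gradients; the weights then move one step along that velocity.
    v = mu * v - lr * grad
    return w + v, v

# Toy loss f(w) = w**2, whose gradient is 2*w; start at w = 1.0.
w, v = 1.0, 0.0
for _ in range(100):
    w, v = sgd_momentum_step(w, v, grad=2 * w)
# After enough steps, w has decayed toward the minimum at 0.
```

In real training the scalar `w` would be a tensor of network parameters and `grad` a minibatch gradient, but the update rule is the same.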