Microsoft Research Blog

  1. Scaling LLM Test-Time Compute with Mobile NPU on Smartphones 

    November 1, 2025

    Deploying Large Language Models (LLMs) on mobile devices faces the challenge of insufficient performance in smaller models and excessive resource consumption in larger ones. This paper highlights that mobile Neural Processing Units (NPUs) have underutilized computational resources, particularly their matrix multiplication units, during typical LLM…

  2. Software Managed Networks via Coarsening 

    November 1, 2025

We propose moving from Software Defined Networks (SDN) to Software Managed Networks (SMN), where all information for managing the life cycle of a network (from deployment to operations to upgrades), across all layers (from Layer 1 through 7), is stored in a central repository. Crucially,…

  3. Jailbreak Distillation: Renewable Safety Benchmarking 

    November 1, 2025

Large language models (LLMs) are rapidly deployed in critical applications, raising urgent needs for robust safety benchmarking. We propose Jailbreak Distillation (JBDistill), a novel benchmark construction framework that "distills" jailbreak attacks into high-quality and easily updatable safety benchmarks. JBDistill utilizes a small set of development models and existing…

  4. BioAgents: Bridging the gap in bioinformatics analysis with multi-agent systems 

    November 1, 2025

    Developing end-to-end bioinformatics workflows is challenging, demanding deep expertise in both genomics and computational techniques. While large language models (LLMs) provide some assistance, they often lack the nuanced guidance required for complex bioinformatics tasks, and are resource-intensive. We thus propose a multi-agent system built on…

  5. Knowledge‐Guided Machine Learning for Operational Flood Forecasting 

    October 31, 2025

    We present a knowledge‐guided machine learning framework for operational hydrologic forecasting at the catchment scale. Our approach, a Factorized Hierarchical Neural Network (FHNN), has two main components: inverse and forward models. The inverse model uses observed precipitation, temperature, and streamflow data to generate a representation…

  6. AI diffusion: UAE and Singapore lead the way, Microsoft says 

    October 31, 2025

    The UAE, Singapore, Norway, Ireland and France received some of the highest scores for AI adoption, a report from Microsoft's AI Economy Institute has found. Microsoft spotlights the UAE and Singapore for “leading in AI use among working-age adults, reflecting their long-term investment in digital connectivity and skills”.

  7. AI Economy Institute 

    October 30, 2025 | Ursula Hardy

The AI Economy Institute (AIEI) is Microsoft’s flagship think tank dedicated to shaping an inclusive, trustworthy AI economy. We are building a network of scholars and convening that network with our subject matter experts to explore how artificial intelligence is transforming work, education, and productivity –…

  8. Future Forward Live – AOC, Part 1 

    October 30, 2025 | Francesca Parmigiani

    Francesca Parmigiani, Principal Research Manager at Microsoft Research Cambridge (@MSFTResearch), on the moment her team realized they were onto something big. “We ran a small version of these banking problems on the actual hardware and saw great accuracy. That was the moment I thought: wow,…

  9. Magentic Marketplace: An Open-Source Environment for Studying Agentic Markets 

    October 30, 2025

As LLM agents advance, they are increasingly mediating economic decisions, ranging from product discovery to transactions, on behalf of users. Such applications promise benefits but also raise many questions about agent accountability and value for users. Addressing these questions requires understanding how agents behave…

  10. Future Forward Live – AOC, Part 2 

    October 30, 2025 | Francesca Parmigiani

    What does an analog optical computer do? Francesca Parmigiani, Principal Research Manager at Microsoft Research Cambridge (@Microsoft), has the answer. “When you start learning a new programming language, you begin with a ‘Hello World.’ For us, that meant recognizing handwritten digits from 0 to 9.”