Research Focus: Week of March 4, 2024
In this issue: Generative kaleidoscopic networks; Text diffusion with reinforced conditioning; PRISE – Learning temporal action abstractions as a sequence compression problem.
Microsoft’s Orca-Math, a specialized small language model, outperforms much larger models in solving math problems that require multi-step reasoning and shows the potential of using feedback to improve language models. Learn more.
Molecular geometry modeling is a powerful tool for understanding the intricate relationships between molecular structure and biological activity – a field known as structure-activity relationships (SAR). The main premise of SAR is that the biological activity of a molecule is dictated by its specific chemical…
Using LLMs to create structured graphs of image descriptors can enhance the images generated by visual language models. Learn how structured knowledge can improve prompt tuning for both visual and language comprehension.
In this issue: CaaSPER: vertical autoscaling algorithm dynamically maintains optimal CPU utilization; Improved scene landmark detection for camera localization runs faster, uses less storage; ESUS simplifies usability questionnaires for technical products and services.
Editor’s note, Apr. 2, 2024 – Figure 1 was updated to clarify the origin of each source. Perhaps the greatest challenge – and opportunity – of LLMs is extending their powerful capabilities to solve problems beyond the data on which they have been trained, and…
The emergence of large language models (LLMs) has revolutionized the way people create text and interact with computing. However, these models are limited in ensuring the accuracy of the content they generate and enforcing strict compliance with specific formats, such as JSON and other computer…
Research Focus: New Research Forum series explores bold ideas in the era of AI; LASER improves reasoning in language models; Cache-Efficient Top-k Aggregation over High Cardinality Large Datasets; Six Microsoft researchers named 2023 ACM Fellows.
Microsoft Research Forum is a new series of conversations that explore recent advances, bold new ideas, and important discussions within the global research community. Leading Microsoft researchers will share insights into their work, followed by live online discussions with audience participants.
Microsoft announces the AFMR Minority Serving Institutions grant recipients, advancing AI research focused on today’s most significant technical and societal challenges. The grant provides funding and access to Azure-hosted foundation models.
Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code/datasets, new hires and other milestones from across the research community at Microsoft. Join Microsoft Research Forum for a continuous exchange of ideas about science and technology…
MetaOpt helps analyze, explain, and improve heuristic performance before deployment in production systems. Learn how it works, particularly in traffic engineering, packet scheduling, and VM placement.