{"id":1057371,"date":"2025-01-17T15:19:21","date_gmt":"2025-01-17T23:19:21","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/"},"modified":"2026-01-21T17:39:05","modified_gmt":"2026-01-22T01:39:05","slug":"physical-ai-research","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/collaboration\/physical-ai-research\/","title":{"rendered":"Physical AI research"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background- card-background--full-bleed\">\n\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"720\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3.jpg\" class=\"attachment-full size-full\" alt=\"Andrey Kolobov viewing code on a large screen with a robotic hand picking up a box in the background\" style=\"object-position: 70% 77%\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3.jpg 1920w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3-300x113.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3-1024x384.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3-768x288.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3-1536x576.jpg 1536w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3-1600x600.jpg 1600w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Embodied-AI-headerV3-240x90.jpg 240w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/>\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div class=\"card-foreground d-flex mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div class=\"container d-flex mt-md-n5 my-lg-5 \">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div class=\"card-body \">\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 class=\"wp-block-heading\" id=\"physical-ai-research\">Physical AI research<\/h1>\n\n\n\n<p>Developing intelligent systems that perform complex tasks through understanding and engaging with physical environments<\/p>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<div class=\"wp-block-media-text has-vertical-margin-small  has-vertical-padding-none  has-media-on-the-right is-stacked-on-mobile is-vertically-aligned-top\" style=\"grid-template-columns:auto 40%\"><div class=\"wp-block-media-text__content\">\n<h2 class=\"wp-block-heading\" id=\"what-is-physical-ai\">What is physical AI?<\/h2>\n\n\n\n<p>Physical AI systems interact with and learn from the physical world through sensory inputs and actions, using robotic tools to perceive, navigate, and interact with their environment.<\/p>\n\n\n\n<p>Our mission is to accelerate and advance research to develop AI agents that can:\u202f<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Learn by interacting with the real world<\/li>\n\n\n\n<li>Adapt to dynamic environments through trial and error<\/li>\n\n\n\n<li>Transfer knowledge in physical spaces<\/li>\n\n\n\n<li>Translate perceptions into actions for completing everyday tasks like 
opening doors, picking up objects, or navigating around obstacles<\/li>\n<\/ol>\n<\/div><figure class=\"wp-block-media-text__media\"><video controls src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Action_Selects_Clip_11.mp4\"><\/video><\/figure><\/div>\n\n\n\n<p>Robots are typically machines equipped with actuators designed to execute specific tasks. Physical AI is inherently interdisciplinary, involving robotic control, reinforcement learning, spatial awareness, human-robot interaction, reasoning, and more. Given this complexity, no single organization can cover all aspects of its development alone. We look forward to collaborating with local industry, academia, and institutions across the globe, leveraging their expertise alongside our strengths in AI to advance the field responsibly.<\/p>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<div class=\"wp-block-media-text has-vertical-margin-small  has-vertical-padding-none  is-stacked-on-mobile is-vertically-aligned-top\" style=\"grid-template-columns:35% auto\"><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-1024x576.jpg\" alt=\"Rho-Alpha | close up image of a robotic arm and hand holding down a power outlet strip while unplugging a small charging adapter.\" class=\"wp-image-1160627 size-full\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2026\/01\/Rho-Alpha_feature_1400x788.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><div class=\"wp-block-media-text__content\">\n<h3 class=\"wp-block-heading\" id=\"advancing-ai-for-the-physical-world\">Advancing AI for the physical world<\/h3>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--1\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/story\/advancing-ai-for-the-physical-world\/\">Read the Rho-Alpha story<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--2\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" 
href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/benchmarking-affordance-generalization-with-busybox\/\">Read the BusyBox paper<\/a><\/div>\n<\/div>\n\n\n\n<blockquote class=\"wp-block-quote is-style-spectrum--blue-green is-layout-flow wp-block-quote-is-layout-flow\">\n<p>&#8220;The emergence of vision-language-action (VLA) models for physical systems is enabling systems to perceive, reason, and act with increasing autonomy alongside humans in environments that are far less structured.&#8221;<\/p>\n<cite>&#8211; Ashley Llorens, CVP & Managing Director, Microsoft Research Accelerator<\/cite><\/blockquote>\n<\/div><\/div>\n\n\n\n<div class=\"wp-block-media-text has-vertical-margin-small  has-vertical-padding-none  has-media-on-the-right is-stacked-on-mobile is-style-border\" style=\"grid-template-columns:auto 40%\"><div class=\"wp-block-media-text__content\">\n<h3 class=\"wp-block-heading\" id=\"advancing-physical-ai\">Advancing physical AI<\/h3>\n\n\n\n<p>We are developing AI systems that integrate perception, reasoning, and control, enabling adaptive and autonomous interaction in dynamic environments.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--3\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/collaboration\/embodied-ai\/research-in-action\/\">Research in action gallery<\/a><\/div>\n<\/div>\n<\/div><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-1024x576.jpg\" alt=\"Researcher working on a computer screen with two robotic arms suspended from a frame with a large screen of code behind\" class=\"wp-image-1129608 size-full\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5193_robotic-arms-code.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/div>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n\n\n<h2 class=\"wp-block-heading\" id=\"advancing-intelligent-embodiments\">Advancing intelligent embodiments<\/h2>\n\n\n\n
<p>We are developing AI systems that integrate perception, reasoning, and control, enabling adaptive and autonomous interaction in dynamic environments.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-4 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" data-id=\"1129548\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics.jpg\" alt=\"photo of three researchers collaborating on a bimanual coordination task in front of two robotic arms suspended from a frame in front of a screen of code\" class=\"wp-image-1129548\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2222-men-desk-robotics-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><figcaption class=\"wp-element-caption\">Our researchers collaborate on a bimanual coordination task, analyzing real-time data and visual feedback to refine AI algorithms.<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" data-id=\"1129542\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table.jpg\" alt=\"Photo of a woman sitting at a Microsoft table in front of a computer screen while demonstrating a robotic arm picking up a glass\" class=\"wp-image-1129542\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-655x368.jpg 655w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-657B2161-Microsoft-table-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><figcaption class=\"wp-element-caption\">Demonstrating a collaborative robotic arm in action, highlighting real-time AI integration for advanced control and planning at Microsoft Research.<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"933\" data-id=\"1129551\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5384-robotic-hand-and-poster.jpg\" alt=\"Photo of a robotic arm grasping an aluminum can on a desk; there is a presentation poster standing at the back edge of the desk\" class=\"wp-image-1129551\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5384-robotic-hand-and-poster.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5384-robotic-hand-and-poster-300x200.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5384-robotic-hand-and-poster-1024x682.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5384-robotic-hand-and-poster-768x512.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5384-robotic-hand-and-poster-240x160.jpg 240w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><figcaption class=\"wp-element-caption\">Leveraging transformer models on Seedonoid for interactive object handling, showcasing Microsoft Research\u2019s advancements in AI-guided robotics.<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"933\" data-id=\"1129539\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5322-group-discussion.jpg\" alt=\"Andrey Kolobov discussing a demonstration with the woman presenting it\" class=\"wp-image-1129539\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5322-group-discussion.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5322-group-discussion-300x200.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5322-group-discussion-1024x682.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5322-group-discussion-768x512.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5322-group-discussion-240x160.jpg 240w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" data-id=\"1129554\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group.jpg\" alt=\"Ashley Llorens and three other men 
standing in a room\" class=\"wp-image-1129554\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5357-Ashley-group-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"933\" data-id=\"1129545\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5278-Andrey-robotic-hand.jpg\" alt=\"Andrey Kolobov manipulating a robotic hand that is reaching for a box of Jell-O\" class=\"wp-image-1129545\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5278-Andrey-robotic-hand.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5278-Andrey-robotic-hand-300x200.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5278-Andrey-robotic-hand-1024x682.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5278-Andrey-robotic-hand-768x512.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/EmbodiedAI-5278-Andrey-robotic-hand-240x160.jpg 240w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><figcaption class=\"wp-element-caption\">A dexterous robotic hand that leverages foundation models and reinforcement learning for precise object manipulation.<\/figcaption><\/figure>\n<\/figure>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n\n\n<p>Embodied AI research involves many different groups at Microsoft, including our partners in product development. 
All are essential in our ongoing research efforts.<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"292\" height=\"292\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/pollefeys-marc_292x292.jpg\" alt=\"headshot of Marc Pollefeys, Lab Director, Mixed Reality and AI Zurich\" class=\"wp-image-608154\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/pollefeys-marc_292x292.jpg 292w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/pollefeys-marc_292x292-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/pollefeys-marc_292x292-180x180.jpg 180w\" sizes=\"auto, (max-width: 292px) 100vw, 292px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/mapoll\/\">Marc Pollefeys<\/a>, Director, Spatial AI Zurich Lab<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Roy-Nejabi_360x360.jpg\" alt=\"Roy Nejabi\" class=\"wp-image-1125255\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Roy-Nejabi_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Roy-Nejabi_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Roy-Nejabi_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Roy-Nejabi_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/rnejabi\/\">Roy Nejabi<\/a>, Principal Program Manager, Spatial AI Zurich Lab<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"397\" height=\"397\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/\u7121\u984c_Katsushi-Ikeuchi_headshot.png\" alt=\"Katsushi Ikeuchi\" class=\"wp-image-1125261\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/\u7121\u984c_Katsushi-Ikeuchi_headshot.png 397w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/\u7121\u984c_Katsushi-Ikeuchi_headshot-300x300.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/\u7121\u984c_Katsushi-Ikeuchi_headshot-150x150.png 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/\u7121\u984c_Katsushi-Ikeuchi_headshot-180x180.png 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/\u7121\u984c_Katsushi-Ikeuchi_headshot-360x360.png 360w\" sizes=\"auto, (max-width: 397px) 100vw, 397px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/katsuike\/\">Katsushi Ikeuchi<\/a>, Senior Principal Research 
Manager, Applied Robotics Research<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Naoki-Wake_360x360.jpg\" alt=\"Naoki Wake\" class=\"wp-image-1125279\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Naoki-Wake_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Naoki-Wake_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Naoki-Wake_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Naoki-Wake_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/nawake\/\">Naoki Wake<\/a>, Senior Researcher, Applied Robotics Research<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/10\/Jun-Takamatsu_360.jpg\" alt=\"headshot of Jun Takamatsu\" class=\"wp-image-891591\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/10\/Jun-Takamatsu_360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/10\/Jun-Takamatsu_360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/10\/Jun-Takamatsu_360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/10\/Jun-Takamatsu_360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/takamatsujun\/\">Jun Takamatsu<\/a>, Senior Researcher, Applied Robotics Research<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"300\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Kazuhiro-Sasabuchi.jpg\" alt=\"Kazuhiro Sasabuchi\" class=\"wp-image-1127166\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Kazuhiro-Sasabuchi.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Kazuhiro-Sasabuchi-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Kazuhiro-Sasabuchi-180x180.jpg 180w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/ieeexplore.ieee.org\/author\/37085680164\" target=\"_blank\" rel=\"noreferrer noopener\">Kazuhiro Sasabuchi<\/a>, Senior Research Scientist, Applied Robotics Research<\/figcaption><\/figure>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" 
width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Lukas-Gruber_360x360.jpg\" alt=\"Lukas Gruber\" class=\"wp-image-1129155\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Lukas-Gruber_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Lukas-Gruber_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Lukas-Gruber_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Lukas-Gruber_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/lugruber\/\">Lukas Gruber<\/a>, Principal Scientist, Spatial AI Zurich Lab<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Jeffrey-Delmerico_360x360.jpg\" alt=\"Jeffrey Delmerico\" class=\"wp-image-1129158\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Jeffrey-Delmerico_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Jeffrey-Delmerico_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Jeffrey-Delmerico_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Jeffrey-Delmerico_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jedelmer\/\">Jeffrey Delmerico<\/a>, Senior Scientist, Spatial AI Zurich Lab<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Isar-Meijer_360x360.jpg\" alt=\"Isar Meijer\" class=\"wp-image-1129161\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Isar-Meijer_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Isar-Meijer_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Isar-Meijer_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/01\/Isar-Meijer_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/isarmeijer\/\">Isar Meijer<\/a>, Software Engineer II, Spatial AI Zurich Lab<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"341\" height=\"341\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Timothy-H-Chung.jpg\" alt=\"Tim Chung\" class=\"wp-image-1129509\" 
srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Timothy-H-Chung.jpg 341w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Timothy-H-Chung-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Timothy-H-Chung-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Timothy-H-Chung-180x180.jpg 180w\" sizes=\"auto, (max-width: 341px) 100vw, 341px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.linkedin.com\/in\/timothy-h-chung\/\" target=\"_blank\" rel=\"noreferrer noopener\">Tim Chung<\/a>, GM Robotics SMT<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Mark-Stevens_360x360.jpg\" alt=\"Mark Stevens\" class=\"wp-image-1131096\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Mark-Stevens_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Mark-Stevens_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Mark-Stevens_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Mark-Stevens_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.linkedin.com\/in\/mark-s-8974998\/\" target=\"_blank\" rel=\"noreferrer noopener\">Mark Stevens<\/a>, Principal Program Manager, SMT<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Dan-Rosenstein_360x360.jpg\" alt=\"Dan Rosenstein\" class=\"wp-image-1131099\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Dan-Rosenstein_360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Dan-Rosenstein_360x360-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Dan-Rosenstein_360x360-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2025\/02\/Dan-Rosenstein_360x360-180x180.jpg 180w\" sizes=\"auto, (max-width: 360px) 100vw, 360px\" \/><figcaption class=\"wp-element-caption\"><a href=\"https:\/\/www.linkedin.com\/in\/daniel-rosenstein-8a87803\/\" target=\"_blank\" rel=\"noreferrer noopener\">Dan Rosenstein<\/a>, Group Product Manager, SMT<\/figcaption><\/figure>\n<\/div>\n<\/div>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Developing intelligent systems that perform complex tasks through understanding and engaging with physical environments Physical AI systems interact with and learn from the physical world through sensory inputs and actions, using robotic tools to perceive, navigate, and interact with their environment. 
Our mission is to accelerate and advance research to develop AI agents that can:\u202f [&hellip;]<\/p>\n","protected":false},"featured_media":1129275,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_group_start":"","footnotes":""},"research-area":[13556],"msr-group-type":[243721],"msr-locale":[268875],"msr-impact-theme":[266208],"class_list":["post-1057371","msr-group","type-msr-group","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-group-type-collaboration","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[],"related-researchers":[{"type":"user_nicename","display_name":"Vivan Amin","user_id":43431,"people_section":"Core team","alias":"aminvivan"},{"type":"user_nicename","display_name":"Ade Famoti","user_id":43005,"people_section":"Core team","alias":"adfamoti"},{"type":"user_nicename","display_name":"Jianlong Fu","user_id":32260,"people_section":"Core team","alias":"jianf"},{"type":"user_nicename","display_name":"Jianfeng Gao","user_id":32246,"people_section":"Core team","alias":"jfgao"},{"type":"user_nicename","display_name":"Baining Guo","user_id":31169,"people_section":"Core team","alias":"bainguo"},{"type":"user_nicename","display_name":"Dongqi Han","user_id":42384,"people_section":"Core team","alias":"dongqihan"},{"type":"guest","display_name":"Mawo Kamakura","user_id":823414,"people_section":"Core team","alias":""},{"type":"user_nicename","display_name":"Andrey Kolobov","user_id":30910,"people_section":"Core team","alias":"akolobov"},{"type":"user_nicename","display_name":"John Langford","user_id":32204,"people_section":"Core team","alias":"jcl"},{"type":"user_nicename","display_name":"Lars Liden","user_id":32612,"people_section":"Core team","alias":"laliden"},{"type":"user_nicename","display_name":"Ashley Llorens","user_id":39964,"people_section":"Core team","alias":"allorens"},{"type":"user_nicename","display_name":"Yasuyuki Matsushita","user_id":43707,"people_section":"Core team","alias":"yamatsushita"},{"type":"user_nicename","display_name":"Galen Mullins","user_id":43846,"people_section":"Core team","alias":"galenmullins"},{"type":"user_nicename","display_name":"Namiko Saito","user_id":43853,"people_section":"Core team","alias":"namikosaito"},{"type":"user_nicename","display_name":"Swadheen Shukla","user_id":38248,"people_section":"Core team","alias":"swads"},{"type":"user_nicename","display_name":"Lily Sun","user_id":32703,"people_section":"Core team","alias":"lisu"},{"type":"guest","display_name":"Reuben Tan","user_id":1112373,"people_section":"Core team","alias":""},{"type":"user_nicename","display_name":"Andrea Tupini","user_id":40339,"people_section":"Core team","alias":"andreatupini"},{"type":"user_nicename","display_name":"Yu Wang","user_id":40783,"people_section":"Core team","alias":"yuwan"},{"type":"user_nicename","display_name":"Qianhui Wu","user_id":40741,"people_section":"Core team","alias":"qianhuiwu"},{"type":"user_nicename","display_name":"Jiaolong Yang","user_id":36125,"people_section":"Core team","alias":"jiaoyan"},{"type":"user_nicename","display_name":"Yizhong Zhang","user_id":41200,"people_section":"Core team","alias":"yizzhan"},{"type":"user_nicename","display_name":"Li Zhao","user_id":36152,"people_section":"Core team","alias":"lizo"},{"type":"guest","display_name":"Timothy H 
Chung","user_id":1132491,"people_section":"Collaborators","alias":""},{"type":"user_nicename","display_name":"Jeffrey Delmerico","user_id":38562,"people_section":"Collaborators","alias":"jedelmer"},{"type":"user_nicename","display_name":"Lukas Gruber","user_id":38565,"people_section":"Collaborators","alias":"lugruber"},{"type":"user_nicename","display_name":"Isar Meijer","user_id":43734,"people_section":"Collaborators","alias":"isarmeijer"},{"type":"user_nicename","display_name":"Roy Nejabi","user_id":42276,"people_section":"Collaborators","alias":"rnejabi"},{"type":"user_nicename","display_name":"Marc Pollefeys","user_id":36191,"people_section":"Collaborators","alias":"mapoll"},{"type":"guest","display_name":"Daniel Rosenstein","user_id":1132485,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"Kazuhiro Sasabuchi","user_id":821887,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"Mark Stevens","user_id":1132488,"people_section":"Collaborators","alias":""}],"related-publications":[1024734,878910,887205,890103,890160,893898,894018,921144,934128,956046,959733,966690,976032,981240,1008411,1021290,870993,1031094,1054221,1095039,1097973,1106502,1106511,1125126,1125132,1125138,1125147,1131801,1133836,1133840,1160604,759781,556701,557466,608856,636900,640416,666612,668262,676473,676920,720403,725098,727504,737221,752056,754918,161046,772627,774328,778030,821614,821692,821698,821713,821740,821920,825145,830602,830740,832453,863139],"related-downloads":[],"related-videos":[1097295,1125153,1133379],"related-projects":[821527,727042],"related-events":[1139371,1035321],"related-opportunities":[],"related-posts":[1122837,1131576],"tab-content":[],"msr_impact_theme":["Discovery"],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/1057371","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":49,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/1057371\/revisions"}],"predecessor-version":[{"id":1160764,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/1057371\/revisions\/1160764"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/1129275"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=1057371"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=1057371"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=1057371"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=1057371"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=1057371"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}