{"id":602418,"date":"2025-09-11T06:31:54","date_gmt":"2025-09-11T13:31:54","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-lab&#038;p=602418"},"modified":"2025-09-17T08:09:49","modified_gmt":"2025-09-17T15:09:49","slug":"spatial-ai-zurich","status":"publish","type":"msr-research-lab","link":"https:\/\/www.microsoft.com\/en-us\/research\/lab\/spatial-ai-zurich\/","title":{"rendered":"Spatial AI Lab \u2013 Zurich"},"content":{"rendered":"\n<div style=\"padding-bottom:64px; padding-top:64px\" class=\"wp-block-msr-immersive-section alignfull row wp-block-msr-immersive-section\">\n\t\n\t<div class=\"container\">\n\t\t<div class=\"wp-block-msr-immersive-section__wrapper\">\n\t\t\t<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"367\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1-e1757670548979-1024x367.png\" alt=\"test\" class=\"wp-image-1149585\" style=\"width:1600px;height:auto\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1-e1757670548979-1024x367.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1-e1757670548979-300x107.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1-e1757670548979-768x275.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1-e1757670548979-240x86.png 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1-e1757670548979.png 1536w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\t\t<\/div>\n\t<\/div>\n\n\t<\/div>\n\n\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>The Spatial AI Lab&nbsp;is&nbsp;part of&nbsp;the&nbsp;<a 
href=\"https:\/\/www.microsoft.com\/applied-sciences?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener\">Applied Sciences Group<\/a>,&nbsp;a&nbsp;Microsoft&nbsp;research and development&nbsp;organization&nbsp;dedicated to creating&nbsp;next-generation human-computer interaction technologies&nbsp;leveraging&nbsp;the most&nbsp;recent&nbsp;AI developments and exploring&nbsp;new hardware&nbsp;capabilities&nbsp;and&nbsp;device&nbsp;form-factors.&nbsp;&nbsp;<\/strong><\/p>\n\n\n\n<p><strong>Our team of scientists and engineers&nbsp;has&nbsp;strong&nbsp;expertise&nbsp;in computer vision&nbsp;and&nbsp;multi-modal&nbsp;AI,&nbsp;with a particular focus on spatial and&nbsp;embodied&nbsp;AI. We&nbsp;work&nbsp;on integrating&nbsp;AI capabilities into Microsoft&nbsp;products, ranging from new&nbsp;AI features in Windows applications,&nbsp;through core AI developments for the Windows&nbsp;platform,&nbsp;to&nbsp;exploring wearable form-factors and autonomous agents. &nbsp;&nbsp;<\/strong><\/p>\n\n\n\n<p><strong>Founded in 2018&nbsp;in Zurich, our lab is led by Marc Pollefeys, Professor of Computer Science at&nbsp;ETH Zurich,&nbsp;and serves as&nbsp;a hub for&nbsp;a&nbsp;strategic&nbsp;partnership&nbsp;between&nbsp;Microsoft and ETH.&nbsp;<\/strong><\/p>\n<\/blockquote>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"683\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1024x683.png\" alt=\"Designer\" class=\"wp-image-1149513\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-1024x683.png 1024w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-300x200.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-768x512.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1-240x160.png 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Designer-1.png 1536w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<h1 class=\"wp-block-heading has-text-align-center\" id=\"windows-and-ai-1\">Windows and AI<\/h1>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>Our team&nbsp;helps&nbsp;develop&nbsp;multimodal foundation models,&nbsp;multimodal embeddings, and generative AI models&nbsp;and&nbsp;incorporate them into&nbsp;Microsoft&nbsp;applications. Our&nbsp;expertise&nbsp;includes&nbsp;image and video&nbsp;analysis and&nbsp;generation,&nbsp;scalable training infrastructure, model training, and deployment in quantized form to edge devices&nbsp;like&nbsp;<a href=\"https:\/\/www.microsoft.com\/en-gb\/windows\/copilot-plus-pcs#npu\" target=\"_blank\" rel=\"noreferrer noopener\">Copilot+ PCs<\/a>.&nbsp;&nbsp;&nbsp;<\/strong><\/p>\n<\/blockquote>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<h1 class=\"wp-block-heading has-text-align-center\" id=\"spatial-and-embodied-ai\">Spatial and Embodied AI<\/h1>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>By integrating spatial and physical awareness into foundation models, we equip AI to understand and interact effectively 
with the real world,&nbsp;helping&nbsp;Copilot&nbsp;answer questions about the physical&nbsp;world&nbsp;and enabling embodied agents&nbsp;and robots&nbsp;to&nbsp;perform tasks.<\/strong><\/p>\n<\/blockquote>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"683\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Thumbnail-Image-1-1024x683.png\" alt=\"robot\" class=\"wp-image-1149543\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Thumbnail-Image-1-1024x683.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Thumbnail-Image-1-300x200.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Thumbnail-Image-1-768x512.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Thumbnail-Image-1-240x160.png 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Thumbnail-Image-1.png 1536w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"612\" height=\"355\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Screenshot-2025-09-10-130001-1.png\" alt=\"it \" class=\"wp-image-1149517\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Screenshot-2025-09-10-130001-1.png 612w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Screenshot-2025-09-10-130001-1-300x174.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Screenshot-2025-09-10-130001-1-240x139.png 240w\" sizes=\"auto, (max-width: 612px) 100vw, 612px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-1024x576.jpg\" alt=\"earth\" class=\"wp-image-1149521\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-1536x864.jpg 1536w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/Minecraft-Earth_Key-Art-Hero-3.jpg 
1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<h1 class=\"wp-block-heading has-text-align-center\" id=\"mixed-reality-hololens-vps\">Mixed Reality&nbsp;\u2013&nbsp;HoloLens&nbsp;& VPS<\/h1>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>Our team&nbsp;made&nbsp;fundamental&nbsp;contributions to Microsoft Mixed Reality and&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.youtube.com\/watch?v=eqFqtAJMtYE\" target=\"_blank\" rel=\"noopener noreferrer\">HoloLens<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>,&nbsp;ranging from&nbsp;object anchoring&nbsp;to&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/learn.microsoft.com\/en-gb\/hololens\/hololens2-moving-platform\" target=\"_blank\" rel=\"noopener noreferrer\">Moving Platform<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.&nbsp;We&nbsp;also&nbsp;co-developed&nbsp;Microsoft&#8217;s first&nbsp;cloud&nbsp;visual positioning system (VPS) for AR, known as&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/azure-int.microsoft.com\/en-us\/products\/spatial-anchors\/#features\" target=\"_blank\" rel=\"noopener noreferrer\">Azure Spatial Anchors<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.&nbsp;Designed to extend HoloLens\u2019s mapping and localization capabilities,&nbsp;it&nbsp;enabled&nbsp;shared experiences across multiple devices,&nbsp;targeting&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.youtube.com\/watch?v=Hai7zZYX8D0\" target=\"_blank\" rel=\"noopener noreferrer\">industrial scenarios<span class=\"sr-only\"> (opens in 
new tab)<\/span><\/a>.&nbsp;<\/strong><\/p>\n<\/blockquote>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>The&nbsp;localization&nbsp;service&nbsp;also&nbsp;powered&nbsp;massively&nbsp;multiplayer&nbsp;outdoor&nbsp;AR phone games such as&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.youtube.com\/watch?v=dYKxBKj29dI&t=1s\" target=\"_blank\" rel=\"noopener noreferrer\">Minecraft Earth<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.&nbsp;Today, the&nbsp;VPS&nbsp;system continues to support our Spatial AI and Robotics research.&nbsp;<\/strong><\/p>\n<\/blockquote>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<h1 class=\"wp-block-heading has-text-align-center\" id=\"collaborations-microsoft-eth\">Collaborations\/Microsoft & ETH<\/h1>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>Our lab&nbsp;maintains&nbsp;a strategic partnership with&nbsp;ETH Zurich.&nbsp;&nbsp;We work particularly closely with the&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/cvg.ethz.ch\/\">Computer Vision and Geometry (CVG) group<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. 
It provides ETH PhD, Master\u2019s, and&nbsp;Bachelor\u2019s&nbsp;students with the opportunity to work directly with&nbsp;Microsoft, gaining hands-on experience, contributing to diverse projects, and&nbsp;benefiting&nbsp;from mentorship.<\/strong>&nbsp;<\/p>\n<\/blockquote>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"549\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053-1024x549.jpg\" alt=\"a group of people posing for a photo\" class=\"wp-image-1149525\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053-1024x549.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053-300x161.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053-768x412.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053-710x380.jpg 710w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053-240x129.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/DSC0053.jpg 1200w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div>\n<\/div>\n\n\n\n\n<footer class=\"single-lab__footer bg-gray-100 py-4 py-md-5\">\n\t<div class=\"container\">\n\t\t<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:50%\">\n<h3 class=\"wp-block-heading\" id=\"leadership\">Leadership<\/h3>\n\n\n\n<div class=\"single-lab__block-footer-section\">\n\t<ul class=\"m-0 list-unstyled\">\n\t\t\t\t\t\t\t\t<li class=\"mb-0\">\n\t\t\t\t<div class=\"single-lab__footer-leader d-flex 
mb-2\">\n\t\t\t\t\t\t\t\t\t\t\t<div class=\"single-lab__footer-leader-avatar\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/mapoll\/\">\n\t\t\t\t\t\t\t\t\t<div class=\"embed-responsive embed-responsive-1by1 rounded-circle\">\n\t\t\t\t\t\t\t\t\t\t<img alt='Portrait of Marc Pollefeys' src='\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Marc-1-180x180.jpg' srcset='\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/11\/Marc-1-360x360.jpg 2x' class='avatar avatar-180 photo avatar m-0 img-object-cover embed-responsive-item' height='180' width='180' \/>\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"single-lab__footer-leader-info\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/mapoll\/\">\n\t\t\t\t\t\t\t\tMarc Pollefeys\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<p>Partner Director of Science<\/p>\n\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/li>\n\t\t\t<\/ul>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<h3 class=\"wp-block-heading\" id=\"address\">Address<\/h3>\n\n\n<div class=\"single-lab__block-footer-section\">\n\t<div class=\"d-flex\">\n\t\t<span class=\"glyph-prepend glyph-prepend-xsmall glyph-prepend-map-pin\"><\/span>\n\t\t<div class=\"ml-2\">\n\t\t\t<address class=\"m-0\">\n\t\t\t\t\t\t\t\t\tSeestrasse 356,\n\t\t\t\t\t<br\/>\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tZurich\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t8038\t\t\t\t\t\t\t\t<br\/>\n\t\t\t\t\t\t\t\t\tSwitzerland\t\t\t\t\t\t\t<\/address>\n\t\t<\/div>\n\t<\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<h3 class=\"wp-block-heading is-style-default\" id=\"contact\">Contact<\/h3>\n\n\n\n<div class=\"single-lab__block-footer-section\">\n\t\t\t<ul 
class=\"wp-block-social-links has-visible-labels is-vertical is-layout-flex wp-container-core-social-links-is-layout-8cf370e7 wp-block-social-links-is-layout-flex\"><li class=\"wp-social-link wp-social-link-facebook  wp-block-social-link\"><a href=\"https:\/\/www.facebook.com\/microsoftresearch\" class=\"wp-block-social-link-anchor\"><svg width=\"24\" height=\"24\" viewBox=\"0 0 24 24\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M12 2C6.5 2 2 6.5 2 12c0 5 3.7 9.1 8.4 9.9v-7H7.9V12h2.5V9.8c0-2.5 1.5-3.9 3.8-3.9 1.1 0 2.2.2 2.2.2v2.5h-1.3c-1.2 0-1.6.8-1.6 1.6V12h2.8l-.4 2.9h-2.3v7C18.3 21.1 22 17 22 12c0-5.5-4.5-10-10-10z\"><\/path><\/svg><span class=\"wp-block-social-link-label\">Facebook<\/span><\/a><\/li>\n\n<li class=\"wp-social-link wp-social-link-x  wp-block-social-link\"><a href=\"https:\/\/x.com\/MSFTResearch\" class=\"wp-block-social-link-anchor\"><svg width=\"24\" height=\"24\" viewBox=\"0 0 24 24\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M13.982 10.622 20.54 3h-1.554l-5.693 6.618L8.745 3H3.5l6.876 10.007L3.5 21h1.554l6.012-6.989L15.868 21h5.245l-7.131-10.378Zm-2.128 2.474-.697-.997-5.543-7.93H8l4.474 6.4.697.996 5.815 8.318h-2.387l-4.745-6.787Z\" \/><\/svg><span class=\"wp-block-social-link-label\">X<\/span><\/a><\/li><\/ul><\/div>\n<\/div>\n<\/div>\t<\/div>\n<\/footer>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_email":"","msr_address_1":"Seestrasse 
356","msr_address_2":"","msr_city":"Zurich","msr_state":"","msr_postal_code":"8038","msr_country":"CH","msr_phone":"","msr_override_map_link":"","msr_lab_director":36191,"msr_lab_leadership":[36191],"msr-author-ordering":[],"footnotes":""},"msr-locale":[268875],"class_list":["post-602418","msr-research-lab","type-msr-research-lab","status-publish","hentry","msr-locale-en_us"],"related-researchers":[{"type":"user_nicename","display_name":"Marc Pollefeys","user_id":36191,"people_section":"Microsoft","alias":"mapoll"},{"type":"user_nicename","display_name":"Cedric Cagniart","user_id":38580,"people_section":"Microsoft","alias":"cecagnia"},{"type":"user_nicename","display_name":"Jeffrey Delmerico","user_id":38562,"people_section":"Microsoft","alias":"jedelmer"},{"type":"user_nicename","display_name":"S\u00f2nia Batllori Pallar\u00e8s","user_id":40189,"people_section":"Microsoft","alias":"soniaba"},{"type":"user_nicename","display_name":"Silvano Galliani","user_id":38601,"people_section":"Microsoft","alias":"sigallia"},{"type":"user_nicename","display_name":"Mario Gini","user_id":40540,"people_section":"Microsoft","alias":"mariogini"},{"type":"user_nicename","display_name":"Lukas Gruber","user_id":38565,"people_section":"Microsoft","alias":"lugruber"},{"type":"user_nicename","display_name":"Kate Jaroslavceva","user_id":43722,"people_section":"Microsoft","alias":"jekatrinaj"},{"type":"user_nicename","display_name":"David Marek","user_id":40186,"people_section":"Microsoft","alias":"damarek"},{"type":"user_nicename","display_name":"Ondrej Miksik","user_id":38556,"people_section":"Microsoft","alias":"onmiksik"},{"type":"user_nicename","display_name":"Patrick Misteli","user_id":39120,"people_section":"Microsoft","alias":"pamistel"},{"type":"user_nicename","display_name":"Roy Nejabi","user_id":42276,"people_section":"Microsoft","alias":"rnejabi"},{"type":"user_nicename","display_name":"Mahdi 
Rad","user_id":41692,"people_section":"Microsoft","alias":"mahdirad"},{"type":"user_nicename","display_name":"Evgeniya Paulinsky","user_id":40879,"people_section":"Microsoft","alias":"eshershikova"},{"type":"user_nicename","display_name":"Christoph Vogel","user_id":38625,"people_section":"Microsoft","alias":"chvogel"},{"type":"user_nicename","display_name":"Rui Wang","user_id":40192,"people_section":"Microsoft","alias":"wangr"},{"type":"user_nicename","display_name":"Krzysztof Waraksa","user_id":42894,"people_section":"Microsoft","alias":"krwaraksa"},{"type":"user_nicename","display_name":"Isar Meijer","user_id":43734,"people_section":"Microsoft","alias":"isarmeijer"},{"type":"user_nicename","display_name":"Simon Janezic","user_id":43737,"people_section":"Microsoft","alias":"simonjanezic"},{"type":"user_nicename","display_name":"Bianca Tazlauanu","user_id":43740,"people_section":"Microsoft","alias":"btazlauanu"},{"type":"user_nicename","display_name":"Remi Pautrat","user_id":43743,"people_section":"Microsoft","alias":"pautratrmi"},{"type":"user_nicename","display_name":"Alessandro Stefanini","user_id":43824,"people_section":"Microsoft","alias":"alstefa"},{"type":"user_nicename","display_name":"Kevin Qu","user_id":43852,"people_section":"Microsoft","alias":"b-kevinqu"},{"type":"user_nicename","display_name":"Gabriele Goletto","user_id":43897,"people_section":"Microsoft","alias":"ggoletto"},{"type":"guest","display_name":"Federica  Bogo","user_id":1108722,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Jan-Willem Buurlage","user_id":1108134,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Juan Nieto","user_id":1108113,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Helen Oleynikova","user_id":727051,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Johannes Sch&ouml;nberger","user_id":1108107,"people_section":"Former Team 
Members","alias":""},{"type":"user_nicename","display_name":"Michal Sta\u0161a","user_id":42759,"people_section":"Former Team Members","alias":"mistasa"},{"type":"guest","display_name":"Jan Stuehmer","user_id":727054,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Bu\u011fra Tekin","user_id":1108176,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Eric Vollenweider","user_id":708409,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Olga Vysotska","user_id":1108170,"people_section":"Former Team Members","alias":""},{"type":"guest","display_name":"Michael Baumgartner","user_id":786175,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Lukas  Bernreiter","user_id":1108731,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Julia Chen","user_id":786169,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Weirong  Chen","user_id":1108194,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Le Chen","user_id":1108200,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Changan  Chen","user_id":1108203,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Tien Do","user_id":1108755,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Arda  D&uuml;z&ccedil;eker","user_id":1108227,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Gabriela Evrova","user_id":708412,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Marcel  Geppert","user_id":786184,"people_section":"Former PhD students, Interns and 
Visitors","alias":""},{"type":"guest","display_name":"Beno&icirc;t Guillard","user_id":1108737,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Fabian  G&ouml;bel","user_id":1108764,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Peter Haas","user_id":1108182,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Yana Hasson","user_id":608811,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Xudong  Jiang","user_id":1108221,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Florian  Kennel-Maushart","user_id":1108242,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Sena Kiciroglu","user_id":1108746,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Timon Knigge","user_id":786181,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Taein Kwon","user_id":607503,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Weizhe  Liu","user_id":786172,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Alexander Millane","user_id":708406,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Simone Rossi","user_id":1108209,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Paul-Edouard Sarlin","user_id":786178,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Emilia Szyma\u0144ska","user_id":1108125,"people_section":"Former PhD students, Interns and 
Visitors","alias":""},{"type":"guest","display_name":"Prune  Truong","user_id":1108188,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Pavel Trutman","user_id":608817,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Fangjinhua  Wang","user_id":1108212,"people_section":"Former PhD students, Interns and Visitors","alias":""},{"type":"guest","display_name":"Chen Zhao","user_id":1108101,"people_section":"Former PhD students, Interns and Visitors","alias":""}],"related-publications":[585760,509486,606297,608844,608856,608865,608877,608889,608904,608913,608931,608940,608949,608958,608967,608976,608985,608994,609003,609012,609021,609153,609156,609183,609195,609201,609216,609225,609231,609237,609252,609261,609414,609423,609834,609843,609864,609873,609882,609891,609900,609909,609918,609927,609936,609945,609951,609957,609969,609975,609984,610017,610026,610035,610044,610053,610062,610074,610083,610092,610101,610107,610119,610152,610176,655743,666360,666399,666549,666567,666594,666672,675435,684156,684195,684444,684450,684459,685404,725479,727504,749701,751756,751819,752395,752440,780877,780907,780946,780964,780979,781081,781207,846721,846754,847498,863139,864111,866286,866646,880944,883350,883359,890160,966630,966648,966657,966666,966681,966690,966699,966708,966717,966726,971430,1021146,1021158,1021167,1021188,1021203,1021218,1021227,1021236,1021290,1034556,1034685,1108776,1108791,1108803,1130628,1162100,1162460,1167830,1168503,1168526],"related-downloads":[],"related-videos":[753655,840679,840688,840703,840733,840748,840826,841633,753421,701518,736075,753364,753379,753388,753397,753412,681063,753430,753439,753451,753460,753469,753478,753487],"related-posts":[591829,577674,687813,701506,885738,972234,1008426,1045767,1102680],"related-events":[835105,816505,726349,716245,633120,637092,621243,610425,605931],"related-groups":[611553],"related-projects":[727042,610503,7
08502],"related-opportunities":[1168104],"related-articles":[],"tab-content":[{"id":0,"name":"Collaborations","content":"The research projects undertaken by the Microsoft Mixed Reality &amp; AI Lab, EPFL, and ETH Zurich are part of a collaborative effort to address research challenges in areas related to AI. These projects are undertaken via the Microsoft Swiss Joint Research Center, established in 2008.\r\n<h3>2022-2023 Projects<\/h3>\r\n[accordion]\r\n[panel header=\"\u00c9cole Polytechnique F\u00e9d\u00e9rale de Lausanne | EPFL Smart Kitchen: Home-Based Functional Assessment Platform for Neurological Patients\"]\r\n<p style=\"margin-bottom: 0px\"><strong>EPFL PIs:<\/strong> Alexander Mathis, Friedhelm Hummel, Silvestro Micera<\/p>\r\n<strong>Microsoft PIs:<\/strong> Marc Pollefeys\r\n<strong>PhD Student:<\/strong> Haozhe Qi\r\n\r\nDespite many advances in neuroprosthetics and neurorehabilitation, the techniques to measure, to personalize and thus to optimize the functional improvements that patients gain with therapy are limited. Impairments are still assessed by standardized functional tests, which fail to capture everyday behaviour and quality of life, are poorly suited to personalization, and have to be performed by trained health care professionals in the clinical environment. By leveraging recent advances in motion capture and hardware, we will create novel metrics to evaluate, personalize and improve the dexterity of patients in their everyday life. We will utilize the EPFL Smart Kitchen platform to assess naturalistic behaviour in the kitchens of healthy subjects, upper-limb amputees and stroke patients, filmed with a head-mounted camera (Microsoft HoloLens). We will develop a computer vision pipeline that is capable of measuring hand-object interactions in patients\u2019 kitchens. 
Based on this novel, large-scale dataset collected in patients\u2019 kitchens, we will derive metrics that measure dexterity in the \u201cnatural world,\u201d as well as recovered and compensatory movements due to the pathology\/assistive device. We will also use those data to assess novel control strategies for neuroprosthetics and to design optimal, personalized rehabilitation treatment by leveraging virtual reality.\r\n[\/panel]\r\n[panel header=\"ETH Zurich | Collaborative Human-Robot Motion Planning with Mixed Reality\"]\r\n<p style=\"margin-bottom: 0px\"><strong>ETH Zurich PIs:<\/strong> Stelian Coros, Roi Poranne<\/p>\r\n<strong>Microsoft PIs:<\/strong> Jeffrey Delmerico, Juan Nieto, Marc Pollefeys\r\n\r\n<strong>PhD Student:<\/strong> Florian Kennel-Maushart\r\n\r\nDespite popular depictions in sci-fi movies and TV shows, robots remain limited in their ability to autonomously solve complex tasks. Indeed, even the most advanced commercial robots are only now starting to navigate man-made environments while performing simple pick-and-place operations. In order to enable complex high-level behaviours, such as the abstract reasoning required to manoeuvre objects in highly constrained environments, we propose to leverage human intelligence and intuition. The challenge here is one of representation and communication. In order to communicate human insights about a problem to a robot, or to communicate a robot\u2019s plans and intent to a human, it is necessary to utilize representations of space, tasks, and movements that are mutually intelligible to both human and robot. This work will focus on the problem of single and multi-robot motion planning with human guidance, where a human assists a team of robots in solving a motion-based task that is beyond the reasoning capabilities of the robot systems. 
We will exploit the ability of Mixed Reality (MR) technology to communicate spatial concepts between robots and humans, and will focus our research efforts on exploring the representations, optimization techniques, and multi-robot task planning necessary to advance the ability of robots to solve complex tasks with human guidance.\r\n[\/panel]\r\n[panel header=\"ETH Zurich | High-Fidelity MR-Me: Lightweight Capture and Photorealistic Differentiable Rendering of Personalized Neural Animatable Avatars\"]\r\n<p style=\"margin-bottom: 0px\"><strong>ETH Zurich PIs:<\/strong> Otmar Hilliges<\/p>\r\n<strong>Microsoft PIs:<\/strong> Julien Valentin\r\n\r\nDigital capture of human bodies is a rapidly growing research area in computer vision and computer graphics that puts scenarios such as life-like Mixed Reality (MR) virtual social interactions within reach, albeit not without several challenging research problems to overcome. A core question in this respect is how to faithfully transmit a virtual copy of oneself so that a remote collaborator may perceive the interaction as immersive and engaging. To present a real alternative to face-to-face meetings, future AR\/VR systems will crucially depend on the following two core building blocks:\r\n1. means to capture the 3D geometry and appearance (e.g., texture, lighting) of individuals with consumer-grade infrastructure (e.g., a single RGB-D camera) and with very little time and expertise, and\r\n2. means to represent the captured geometry and appearance information in a fashion that is suitable for photorealistic rendering under fine-grained control over underlying factors such as pose and facial expressions, amongst others.\r\nIn this project, we plan to develop novel methods to learn animatable representations of humans from \u2018cheap\u2019 data sources alone. 
Furthermore, we plan to extend our own recent work on animatable neural implicit surfaces, such that it can represent not only the geometry but also the appearance of subjects in high visual fidelity. Finally, we plan to study techniques to enforce geometric and temporal consistency in such methods to make them suitable for MR and other telepresence downstream applications.\r\n[\/panel]\r\n[panel header=\"ETH Zurich | Mixed Reality for Shared Autonomy\"]\r\n<p style=\"margin-bottom: 0px\"><strong>ETH Zurich PIs:<\/strong> Marco Tognon, Mike Allenspach, Nicholas Lawrence, Roland Siegwart<\/p>\r\n<strong>Microsoft PIs:<\/strong> Jeffrey Delmerico, Juan Nieto, Marc Pollefeys\r\n\r\n[\/panel]\r\n<h3>2019-2021 Projects<\/h3>\r\n[panel header=\"\u00c9cole Polytechnique F\u00e9d\u00e9rale de Lausanne | Hands in Contact for Augmented Reality\"]\r\n<p style=\"margin-bottom: 0px\"><strong>EPFL PIs:<\/strong> Pascal Fua, Mathieu Salzmann, Helge Rhodin<\/p>\r\n<strong>Microsoft PIs:<\/strong> Sudipta Sinha, Marc Pollefeys\r\n\r\nIn recent years, there has been tremendous progress in camera-based 6D object pose, hand pose, and human 3D pose estimation. All of these can now be done in real time, but not yet to the level of accuracy required to properly capture how people interact with each other and with objects, which is a crucial component of modeling the world in which we live. For example, when someone grasps an object, types on a keyboard, or shakes someone else\u2019s hand, the position of their fingers with respect to what they are interacting with must be precisely recovered for the resulting models to be used by AR devices, such as the HoloLens or consumer-level video see-through AR devices. This remains a challenge, especially given the fact that hands are often severely occluded in the egocentric views that are the norm in AR. We will, therefore, work on accurately capturing the interaction between hands and the objects they touch and manipulate. 
At the heart of this will be the precise modeling of contact points and the resulting physical forces between interacting hands and objects. This is essential for two reasons. First, objects in contact exert forces on each other; their pose and motion can only be accurately captured and understood if reaction forces at contact points and areas are modeled jointly. Second, touch and touch-force devices, such as keyboards and touch-screens, are the most common human-computer interfaces, and by sensing contact and contact forces purely visually, everyday objects could be turned into tangible interfaces that react as if they were equipped with touch-sensitive electronics. For instance, a soft cushion could become a non-intrusive input device that, unlike virtual mid-air menus, provides natural force feedback. We will present some of our preliminary results and discuss our research agenda for the year to come.\r\n[\/panel]\r\n[panel header=\"ETH Zurich | Skilled Assistive-Care Robots through Immersive Mixed-Reality Telemanipulation\"]\r\n<p style=\"margin-bottom: 0px\"><strong>ETH Zurich PIs:<\/strong> Stelian Coros, Roi Poranne<\/p>\r\n<strong>Microsoft PIs:<\/strong> Marc Pollefeys\r\n\r\nWith this project, we aim to accelerate the development of intelligent robots that can assist those in need with a variety of everyday tasks. People suffering from physical impairments, for example, often need help dressing or brushing their own hair. Skilled robotic assistants would allow these persons to live an independent lifestyle. Even such seemingly simple tasks, however, require complex manipulation of physical objects, advanced motion planning capabilities, as well as close interactions with human subjects. We believe the key to robots being able to undertake such societally important functions is learning from demonstration. 
The fundamental research question is, therefore: how can we enable human operators to seamlessly teach a robot how to perform complex tasks? The answer, we argue, lies in immersive telemanipulation. More specifically, we are inspired by the vision of James Cameron\u2019s Avatar, where humans are endowed with alternative embodiments. In such a setting, the human\u2019s intent must be seamlessly mapped to the motions of a robot as the human operator becomes completely immersed in the environment the robot operates in. To achieve this ambitious vision, many technologies must come together: mixed reality as the medium for robot-human communication, perception and action recognition to detect the intent of both the human operator and the human patient, motion retargeting techniques to map the actions of the human to the robot\u2019s motions, and physics-based models to enable the robot to predict and understand the implications of its actions.\r\n[\/panel]\r\n[panel header=\"ETH Zurich | A Modular Approach for Lifelong Mapping from End-User Data\"]\r\n<p style=\"margin-bottom: 0px\"><strong>ETH Zurich PIs:<\/strong> Roland Siegwart, Cesar Cadena, Juan Nieto<\/p>\r\n<strong>Microsoft PIs:<\/strong> Johannes Sch\u00f6nberger, Marc Pollefeys\r\n\r\nAR\/VR allow new and innovative ways of visualizing information and provide a very intuitive interface for interaction. At their core, they rely only on a camera and inertial measurement unit (IMU) setup or a stereo-vision setup to provide the necessary data, either of which is readily available on most commercial mobile devices. This technology has already seen early adoption in the real estate business, sports, gaming, retail, tourism, transportation, and many other fields. 
The current technologies in visual-aided motion estimation and mapping on mobile devices have three main requirements to produce highly accurate 3D metric reconstructions: 1. an accurate spatial and temporal calibration of the sensor suite, a procedure typically carried out with the help of external infrastructure, such as calibration markers, and by following a set of predefined movements; 2. well-lit, textured environments and feature-rich, smooth trajectories; and 3. the continuous and reliable operation of all sensors involved. This project aims at relaxing these requirements to enable continuous and robust lifelong mapping on end-user mobile devices. Thus, the specific objectives of this work are to: 1. formalize a modular and adaptable multi-modal sensor fusion framework for online map generation; 2. improve the robustness of mapping and motion estimation by exploiting high-level semantic features; and 3. develop techniques for automatic detection and execution of sensor calibration in the wild. A modular SLAM (simultaneous localization and mapping) pipeline that is able to exploit all available sensing modalities can overcome the individual limitations of each sensor and increase the overall robustness of the estimation. Such an information-rich map representation allows us to leverage recent advances in semantic scene understanding, providing an abstraction from low-level geometric features - which are fragile to noise, sensing conditions, and small changes in the environment - to higher-level semantic features that are robust against these effects. Using this complete map representation, we will explore new ways to detect miscalibrations and sensor failures, so that the SLAM process can be adapted online without the need for explicit user intervention. 
[\/panel] [\/accordion]\r\n<div style=\"height: 20px\"><\/div>\r\n[row][column class=\"m-col-6-24\"]\r\n<a href=\"https:\/\/www.epfl.ch\/en\/\" target=\"_blank\" rel=\"noopener noreferrer\"><img class=\"aligncenter wp-image-691188\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/09\/EPFL_300x200.jpg\" alt=\"EPFL logo\" width=\"300\" height=\"200\" \/><\/a>\r\n[\/column]\r\n[column class=\"m-col-6-24\"]\r\n<a href=\"https:\/\/www.c4dt.org\/\" target=\"_blank\" rel=\"noopener noreferrer\"><img class=\"aligncenter wp-image-691194\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/09\/C4DT_300x200.jpg\" alt=\"Center for Digital Trust logo\" width=\"300\" height=\"200\" \/><\/a>\r\n[\/column]\r\n[column class=\"m-col-6-24\"]\r\n<a href=\"https:\/\/cyberpeaceinstitute.org\/\" target=\"_blank\" rel=\"noopener noreferrer\"><img class=\"aligncenter wp-image-691197\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/09\/CyberPeaceInstitute_300x200.jpg\" alt=\"CyberPeace Institute logo\" width=\"300\" height=\"200\" \/><\/a>\r\n[\/column]\r\n[column class=\"m-col-6-24\"]\r\n<a href=\"https:\/\/ethz.ch\/en.html\" target=\"_blank\" rel=\"noopener noreferrer\"><img class=\"aligncenter wp-image-691200\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/09\/eth_300x200.jpg\" alt=\"ETH Zurich logo\" width=\"300\" height=\"200\" 
\/><\/a>\r\n[\/column][\/row]"}],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-lab\/602418","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-lab"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-lab"}],"version-history":[{"count":106,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-lab\/602418\/revisions"}],"predecessor-version":[{"id":1159791,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-lab\/602418\/revisions\/1159791"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=602418"}],"wp:term":[{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=602418"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}