{"id":633120,"date":"2020-02-02T08:18:44","date_gmt":"2020-02-02T16:18:44","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-event&#038;p=633120"},"modified":"2025-03-03T08:55:31","modified_gmt":"2025-03-03T16:55:31","slug":"mixed-reality-and-robotics-tutorial-iros-2020","status":"publish","type":"msr-event","link":"https:\/\/www.microsoft.com\/en-us\/research\/event\/mixed-reality-and-robotics-tutorial-iros-2020\/","title":{"rendered":"Mixed Reality and Robotics &#8211; Tutorial @ IROS 2020"},"content":{"rendered":"\n\n\n\n\n<p><strong>This tutorial relied on Microsoft\u2019s Azure Spatial Anchors (ASA) service, which has been retired (see&nbsp;<a id=\"LPlnk115674\" class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" title=\"Original URL: https:\/\/azure.microsoft.com\/en-us\/updates\/azure-spatial-anchors-retirement\/. Click or tap if you trust this link.\" href=\"https:\/\/azure.microsoft.com\/en-us\/updates?id=azure-spatial-anchors-retirement\" target=\"_blank\" rel=\"noopener\" data-auth=\"NotApplicable\" data-linkindex=\"1\">announcement<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>). 
This page will remain up as a record, but the demos associated with this tutorial will no longer work.<\/strong><\/p>\n\n\n\n<p><strong>Conference:<\/strong> <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/index.html\" target=\"_blank\" rel=\"noopener noreferrer\">IROS 2020<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/p>\n\n\n\n<p><strong>Contact:<\/strong><br><a href=\"mailto:jeffrey.delmerico@microsoft.com\">jeffrey.delmerico@microsoft.com<\/a> and <a href=\"mailto:helen.oleynikova@microsoft.com\">helen.oleynikova@microsoft.com<\/a><\/p>\n\n\n\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\n\n\n<p>Welcome to the Mixed Reality and Robotics Tutorial at <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/index.html\" target=\"_blank\" rel=\"noopener\">IROS 2020<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. This year\u2019s conference is using an on-demand, virtual format, which means that all of the content for this tutorial is available as streaming videos, with code samples to accompany the demos. However, the conference organizers have made registration FREE, so you can gain access to all of the talks and papers, as well as the workshops and tutorials (including this one). Please see the Agenda tab for more detailed information about the tutorial contents.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"registration\">Registration<\/h3>\n\n\n\n<p>In order to view the tutorial videos, you will need to be registered for the IROS conference. 
However, to help us better understand the research interests of the audience, and to more easily contact IROS attendees who are interested in Mixed Reality, we would kindly ask that you click the link in the top left to register for this event. Registration for our tutorial is not binding, and is separate from the IROS conference registration.&nbsp;In order to access the content for this tutorial through the IROS On-Demand site, you will still need to register for the IROS conference.<strong><br><\/strong><\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"abstract\">Abstract<\/h3>\n\n\n\n<p>Mixed, Augmented, and Virtual Reality offer exciting new frontiers in communication, entertainment, and productivity. A primary feature of Mixed Reality (MR) is the ability to register the digital world with the physical one, opening the door to a wide variety of robotics applications. This capability enables more natural human-robot interaction: instead of a user interfacing with a robot through a computer screen, we envision a future in which the user interacts with a robot in the same environment through MR, to see what it sees, to see its intentions, and seamlessly control it in its own representation of the world.<\/p>\n\n\n\n<p>The purpose of this tutorial is to introduce the audience to both the high-level concepts of Mixed Reality and practically demonstrate how these concepts can be used to interact with a robot through an MR device. We will discuss how various hardware devices (mobile phones, AR\/MR\/VR headsets, and robots\u2019 on-board sensors) can integrate with cloud services to create a digital representation of the physical world, and how such a representation can be used for co-localization. 
Participants will have a chance to create an iOS, Android, or Microsoft HoloLens 2 app to control and interact with a virtual robot, with instructions on how to adapt the sample code to a real robot, so attendees can start using Mixed Reality in their own robotics projects.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"workshop-organizers\">Workshop Organizers<\/h4>\n\n\n\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/mapoll\/\">Marc Pollefeys<\/a><br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/eloleyni\/\">Helen Oleynikova<\/a><br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jedelmer\/\">Jeff Delmerico<\/a><\/p>\n\n\n\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n\n\n<h2 class=\"wp-block-heading\" id=\"tutorial-contents\">Tutorial Contents<\/h2>\n\n\n\n<p>We cover \u201cbig picture\u201d ideas of Mixed Reality and how we envision that it will transform how we interact with robots, along with technical details on a few different ways to do colocalization to allow any Mixed or Augmented Reality device to share a coordinate frame with a robot.&nbsp;Finally, there is a practical portion where we introduce a few of the tools that are necessary to create full Mixed Reality experiences with robotics. 
This takes the form of several demos that attendees will be able to build and run on their own, and adapt to use with their own robots.<\/p>\n\n\n\n<p>The tutorial features five videos on the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/ondemand\/index.html\" target=\"_blank\" rel=\"noopener\">IROS 2020 streaming site<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Introduction to MR and Robotics<\/strong><\/li>\n\n\n\n<li><strong>Interaction<\/strong>\n<ul class=\"wp-block-list\">\n<li>Mixed Reality as an intuitive bridge between robots and humans<\/li>\n\n\n\n<li>MR, AR, VR, a brief overview of differences and sample devices<\/li>\n\n\n\n<li>Modes of Interaction in MR<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Colocalization<\/strong>\n<ul class=\"wp-block-list\">\n<li>Co-localization with Mixed Reality devices\n<ul class=\"wp-block-list\">\n<li>AR-tag-based<\/li>\n\n\n\n<li>Vision-based<\/li>\n\n\n\n<li>Shared-map-based<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li>Azure Spatial Anchors\n<ul class=\"wp-block-list\">\n<li>Technical introduction<\/li>\n\n\n\n<li>How to use ASA to colocalize different devices<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Demo 1: Interaction <\/strong>[<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\" target=\"_blank\" rel=\"noopener noreferrer\">Source Code<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>]\n<ul class=\"wp-block-list\">\n<li>Writing and deploying phone and HoloLens apps\n<ul class=\"wp-block-list\">\n<li>Unity<\/li>\n\n\n\n<li>ROS# and ROS bridge for interfacing with ROS<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li>Interacting with a virtual robot through AR and 
MR<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Demo 2: Colocalization<\/strong>\n<ul class=\"wp-block-list\">\n<li>Azure Spatial Anchors SDK for localization of robots and MR devices<\/li>\n\n\n\n<li>Creating and querying spatial anchors using sample data<\/li>\n\n\n\n<li>How to use this code with your own camera<\/li>\n<\/ul>\n<\/li>\n<\/ol>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"demo-materials\">Demo Materials<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"demo-1-interaction\">Demo 1 \u2013 Interaction<\/h3>\n\n\n\n<p>Sample code for the exercises in this demo can be found here: <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\" target=\"_blank\" rel=\"noopener noreferrer\">https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/p>\n\n\n\n<p>This repo contains an extensive <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\/wiki\" target=\"_blank\" rel=\"noopener noreferrer\">wiki<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> with instructions on how to run the demo with pre-built apps and docker containers, how to set up your system to develop and deploy MR apps, and how to adapt the sample code to your own robot.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/iros-2020-mixed-reality-and-robotics-tutorial-demo-1-interaction\/\">Watch recording<\/a><\/div>\n<\/div>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"demo-2-colocalization\">Demo 2 \u2013 Colocalization<\/h3>\n\n\n\n<p>This demo relied on Microsoft&#8217;s Azure Spatial Anchors 
(ASA) service, which has been retired (see <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/azure.microsoft.com\/en-us\/updates?id=azure-spatial-anchors-retirement\" target=\"_blank\" rel=\"noopener noreferrer\">announcement<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>). All code and instructions associated with this demo have also been taken down.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/iros-2020-mixed-reality-and-robotics-tutorial-demo-2-co-localization\/\">Watch recording<\/a><\/div>\n<\/div>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"conclusion\">Conclusion<\/h2>\n\n\n\n<p>We hope that this information and these tools help you to incorporate Mixed Reality into your robotics projects, for colocalization and\/or human-robot interaction.&nbsp;We would like to encourage you to send us feedback on your experience with the tutorial.&nbsp;Please engage with us on GitHub by filing issues (for questions or problems not covered in the wikis) or contributing to the two repositories.<\/p>\n\n\n\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n","protected":false},"excerpt":{"rendered":"<p>The Mixed Reality and Robotics tutorial at IROS 2020 will introduce the audience to both the high-level concepts of Mixed Reality and practically demonstrate how these concepts can be used to interact with a robot through an MR 
device.<\/p>\n","protected":false},"featured_media":661089,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_startdate":"2020-10-25","msr_enddate":"2020-10-25","msr_location":"On-demand Virtual","msr_expirationdate":"","msr_event_recording_link":"","msr_event_link":"","msr_event_link_redirect":false,"msr_event_time":"","msr_hide_region":true,"msr_private_event":false,"msr_hide_image_in_river":null,"footnotes":""},"research-area":[13562],"msr-region":[239178],"msr-event-type":[197941],"msr-video-type":[],"msr-locale":[268875],"msr-program-audience":[],"msr-post-option":[],"msr-impact-theme":[],"class_list":["post-633120","msr-event","type-msr-event","status-publish","has-post-thumbnail","hentry","msr-research-area-computer-vision","msr-region-europe","msr-event-type-conferences","msr-locale-en_us"],"msr_about":"<!-- wp:msr\/event-details {\"title\":\"Mixed Reality and Robotics - Tutorial @ IROS 2020\",\"image\":{\"id\":661089,\"url\":\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720.jpg\",\"alt\":\"\"}} \/-->\n\n<!-- wp:msr\/content-tabs -->\n<!-- wp:msr\/content-tab {\"title\":\"About\"} -->\n<!-- wp:paragraph -->\n<p><strong>This tutorial relied on Microsoft\u2019s Azure Spatial Anchors (ASA) service, which has been retired (see&nbsp;<a id=\"LPlnk115674\" class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" title=\"Original URL: https:\/\/azure.microsoft.com\/en-us\/updates\/azure-spatial-anchors-retirement\/. Click or tap if you trust this link.\" href=\"https:\/\/azure.microsoft.com\/en-us\/updates?id=azure-spatial-anchors-retirement\" target=\"_blank\" rel=\"noopener\" data-auth=\"NotApplicable\" data-linkindex=\"1\">announcement<\/a>). 
This page will remain up as a record, but the demos associated with this tutorial will no longer work.<\/strong><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p><strong>Conference:<\/strong> <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/index.html\" target=\"_blank\" rel=\"noopener noreferrer\">IROS 2020<\/a><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p><strong>Contact:<\/strong><br><a href=\"mailto:jeffrey.delmerico@microsoft.com\">jeffrey.delmerico@microsoft.com<\/a> and <a href=\"mailto:helen.oleynikova@microsoft.com\">helen.oleynikova@microsoft.com<\/a><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>Welcome to the Mixed Reality and Robotics Tutorial at <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/index.html\" target=\"_blank\" rel=\"noopener\">IROS 2020<\/a>. This year\u2019s conference is using an on-demand, virtual format, which means that all of the content for this tutorial is available as streaming videos, with code samples to accompany the demos. However, the conference organizers have made registration FREE, so you can gain access to all of the talks and papers, as well as the workshops and tutorials (including this one). 
Please see the Agenda tab for more detailed information about the tutorial contents.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:heading {\"level\":3} -->\n<h3 class=\"wp-block-heading\" id=\"registration\">Registration<\/h3>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>In order to view the tutorial videos, you will need to be registered for the IROS conference. However, to help us better understand the research interests of the audience, and to more easily contact IROS attendees who are interested in Mixed Reality, we would kindly ask that you click the link in the top left to register for this event. Registration for our tutorial is not binding, and is separate from the IROS conference registration.&nbsp;In order to access the content for this tutorial through the IROS On-Demand site, you will still need to register for the IROS conference.<strong><br><\/strong><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:heading {\"level\":3} -->\n<h3 class=\"wp-block-heading\" id=\"abstract\">Abstract<\/h3>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>Mixed, Augmented, and Virtual Reality offer exciting new frontiers in communication, entertainment, and productivity. A primary feature of Mixed Reality (MR) is the ability to register the digital world with the physical one, opening the door to a wide variety of robotics applications. This capability enables more natural human-robot interaction: instead of a user interfacing with a robot through a computer screen, we envision a future in which the user interacts with a robot in the same environment through MR, to see what it sees, to see its intentions, and seamlessly control it in its own representation of the world.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>The purpose of this tutorial is to introduce the audience to both the high-level concepts of Mixed Reality and practically demonstrate how these concepts can be used to interact with a robot through an MR device. 
We will discuss how various hardware devices (mobile phones, AR\/MR\/VR headsets, and robots\u2019 on-board sensors) can integrate with cloud services to create a digital representation of the physical world, and how such a representation can be used for co-localization. Participants will have a chance to create an iOS, Android, or Microsoft HoloLens 2 app to control and interact with a virtual robot, with instructions on how to adapt the sample code to a real robot, so attendees can start using Mixed Reality in their own robotics projects.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:heading {\"level\":4} -->\n<h4 class=\"wp-block-heading\" id=\"workshop-organizers\">Workshop Organizers<\/h4>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/mapoll\/\">Marc Pollefeys<\/a><br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/eloleyni\/\">Helen Oleynikova<\/a><br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jedelmer\/\">Jeff Delmerico<\/a><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:spacer {\"height\":\"25px\"} -->\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n<!-- \/wp:msr\/content-tab -->\n\n<!-- wp:msr\/content-tab {\"title\":\"Agenda\"} -->\n<!-- wp:heading -->\n<h2 class=\"wp-block-heading\" id=\"tutorial-contents\">Tutorial Contents<\/h2>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>We cover \u201cbig picture\u201d ideas of Mixed Reality and how we envision that it will transform how we interact with robots, along with technical details on a few different ways to do colocalization to allow any Mixed or Augmented Reality device to share a coordinate frame with a robot.&nbsp;Finally, there is a practical portion where we introduce a few of the tools that are necessary to create full Mixed Reality experiences with robotics. 
This takes the form of several demos that attendees will be able to build and run on their own, and adapt to use with their own robots.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>The tutorial features five videos on the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/ondemand\/index.html\" target=\"_blank\" rel=\"noopener\">IROS 2020 streaming site<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>:<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:list {\"ordered\":true} -->\n<ol class=\"wp-block-list\"><!-- wp:list-item -->\n<li><strong>Introduction to MR and Robotics<\/strong><\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li><strong>Interaction<\/strong><!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>Mixed Reality as an intuitive bridge between robots and humans<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>MR, AR, VR, a brief overview of differences and sample devices<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>Modes of Interaction in MR<\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li><strong>Colocalization<\/strong><!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>Co-localization with Mixed Reality devices<!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>AR-tag-based<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>Vision-based<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>Shared-map-based<\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>Azure Spatial Anchors<!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>Technical introduction<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>How to use 
ASA to colocalize different devices<\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li><strong>Demo 1: Interaction <\/strong>[<a href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\" target=\"_blank\" rel=\"noreferrer noopener\">Source Code<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>]<!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>Writing and deploying phone and HoloLens apps<!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>Unity<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>ROS# and ROS bridge for interfacing with ROS<\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>Interacting with a virtual robot through AR and MR<\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li><strong>Demo 2: Colocalization<\/strong><!-- wp:list -->\n<ul class=\"wp-block-list\"><!-- wp:list-item -->\n<li>Azure Spatial Anchors SDK for localization of robots and MR devices<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>Creating and querying spatial anchors using sample data<\/li>\n<!-- \/wp:list-item -->\n\n<!-- wp:list-item -->\n<li>How to use this code with your own camera<\/li>\n<!-- \/wp:list-item --><\/ul>\n<!-- \/wp:list --><\/li>\n<!-- \/wp:list-item --><\/ol>\n<!-- \/wp:list -->\n\n<!-- wp:heading -->\n<h2 class=\"wp-block-heading\" id=\"demo-materials\">Demo Materials<\/h2>\n<!-- \/wp:heading -->\n\n<!-- wp:heading {\"level\":3} -->\n<h3 class=\"wp-block-heading\" id=\"demo-1-interaction\">Demo 1 \u2013 Interaction<\/h3>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>Sample code for the exercises in this demo can be found here: <a href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\" 
target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>This repo contains an extensive <a href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\/wiki\" target=\"_blank\" rel=\"noreferrer noopener\">wiki<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> with instructions on how to run the demo with pre-built apps and docker containers, how to set up your system to develop and deploy MR apps, and how to adapt the sample code to your own robot.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:buttons -->\n<div class=\"wp-block-buttons\"><!-- wp:button -->\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/iros-2020-mixed-reality-and-robotics-tutorial-demo-1-interaction\/\">Watch recording<\/a><\/div>\n<!-- \/wp:button --><\/div>\n<!-- \/wp:buttons -->\n\n<!-- wp:heading {\"level\":3} -->\n<h3 class=\"wp-block-heading\" id=\"demo-2-colocalization\">Demo 2 \u2013 Colocalization<\/h3>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>This demo relied on Microsoft's Azure Spatial Anchors (ASA) service, which has been retired (see <a href=\"https:\/\/azure.microsoft.com\/en-us\/updates?id=azure-spatial-anchors-retirement\" target=\"_blank\" rel=\"noreferrer noopener\">announcement<\/a>). 
All code and instructions associated with this demo have also been taken down.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:buttons -->\n<div class=\"wp-block-buttons\"><!-- wp:button -->\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/iros-2020-mixed-reality-and-robotics-tutorial-demo-2-co-localization\/\">Watch recording<\/a><\/div>\n<!-- \/wp:button --><\/div>\n<!-- \/wp:buttons -->\n\n<!-- wp:heading -->\n<h2 class=\"wp-block-heading\" id=\"conclusion\">Conclusion<\/h2>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>We hope that this information and these tools help you to incorporate Mixed Reality into your robotics projects, for colocalization and\/or human-robot interaction.&nbsp;We would like to encourage you to send us feedback on your experience with the tutorial.&nbsp;Please engage with us on GitHub by filing issues (for questions or problems not covered in the wikis) or contributing to the two repositories.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:spacer {\"height\":\"25px\"} -->\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n<!-- \/wp:msr\/content-tab -->\n<!-- \/wp:msr\/content-tabs -->","tab-content":[{"id":0,"name":"About","content":"Welcome to the Mixed Reality and Robotics Tutorial at <a href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/index.html\" target=\"_blank\" rel=\"noopener\">IROS 2020<\/a>. This year's conference is using an on-demand, virtual format, which means that all of the content for this tutorial is available as streaming videos, with code samples to accompany the demos. 
However, the conference organizers have made registration FREE, so you can gain access to all of the talks and papers, as well as the workshops and tutorials (including this one). Please see the Agenda tab for more detailed information about the tutorial contents.\r\n<h3>Registration<\/h3>\r\nIn order to view the tutorial videos, you will need to be registered for the IROS conference. However, to help us better understand the research interests of the audience, and to more easily contact IROS attendees who are interested in Mixed Reality, we would kindly ask that you click the link in the top left to register for this event. Registration for our tutorial is not binding, and is separate from the IROS conference registration.\u00a0In order to access the content for this tutorial through the IROS On-Demand site, you will still need to register for the IROS conference.<strong>\r\n<\/strong>\r\n<h3>Abstract<\/h3>\r\nMixed, Augmented, and Virtual Reality offer exciting new frontiers in communication, entertainment, and productivity. A primary feature of Mixed Reality (MR) is the ability to register the digital world with the physical one, opening the door to a wide variety of robotics applications. This capability enables more natural human-robot interaction: instead of a user interfacing with a robot through a computer screen, we envision a future in which the user interacts with a robot in the same environment through MR, to see what it sees, to see its intentions, and seamlessly control it in its own representation of the world.\r\n\r\nThe purpose of this tutorial is to introduce the audience to both the high-level concepts of Mixed Reality and practically demonstrate how these concepts can be used to interact with a robot through an MR device. 
We will discuss how various hardware devices (mobile phones, AR\/MR\/VR headsets, and robots\u2019 on-board sensors) can integrate with cloud services to create a digital representation of the physical world, and how such a representation can be used for co-localization. Participants will have a chance to create an iOS, Android, or Microsoft HoloLens 2 app to control and interact with a virtual robot, with instructions on how to adapt the sample code to a real robot, so attendees can start using Mixed Reality in their own robotics projects.\r\n<h4>Workshop Organizers<\/h4>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/mapoll\/\">Marc Pollefeys<\/a>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/eloleyni\/\">Helen Oleynikova<\/a>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jedelmer\/\">Jeff Delmerico<\/a>"},{"id":1,"name":"Agenda","content":"<h2>Tutorial Contents<\/h2>\r\nWe cover \"big picture\" ideas of Mixed Reality and how we envision that it will transform how we interact with robots, along with technical details on a few different ways to do colocalization to allow any Mixed or Augmented Reality device to share a coordinate frame with a robot.\u00a0 \u00a0Finally, there is a practical portion where we introduce a few of the tools that are necessary to create full Mixed Reality experiences with robotics. 
This takes the form of several demos that attendees will be able to build and run on their own, and adapt to use with their own robots.\r\n\r\nThe tutorial features five videos on the <a href=\"https:\/\/ewh.ieee.org\/soc\/ras\/conf\/financiallycosponsored\/IROS\/IROS2020\/www.iros2020.org\/ondemand\/index.html\" target=\"_blank\" rel=\"noopener\">IROS 2020 streaming site<\/a>:\r\n<ol>\r\n \t<li><strong>Introduction to MR and Robotics<\/strong><\/li>\r\n \t<li><strong>Interaction<\/strong>\r\n<ul>\r\n \t<li>Mixed Reality as an intuitive bridge between robots and humans<\/li>\r\n \t<li>MR, AR, VR, a brief overview of differences and sample devices<\/li>\r\n \t<li>Modes of Interaction in MR<\/li>\r\n<\/ul>\r\n<\/li>\r\n \t<li><strong>Colocalization<\/strong>\r\n<ul>\r\n \t<li>Co-localization with Mixed Reality devices\r\n<ul>\r\n \t<li>AR-tag-based<\/li>\r\n \t<li>Vision-based<\/li>\r\n \t<li>Shared-map-based<\/li>\r\n<\/ul>\r\n<\/li>\r\n \t<li>Azure Spatial Anchors\r\n<ul>\r\n \t<li>Technical introduction<\/li>\r\n \t<li>How to use ASA to colocalize different devices<\/li>\r\n<\/ul>\r\n<\/li>\r\n<\/ul>\r\n<\/li>\r\n \t<li><strong>Demo 1: Interaction <\/strong>[<a href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\">Source Code<\/a>]\r\n<ul>\r\n \t<li>Writing and deploying phone and HoloLens apps\r\n<ul>\r\n \t<li>Unity<\/li>\r\n \t<li>ROS# and ROS bridge for interfacing with ROS<\/li>\r\n<\/ul>\r\n<\/li>\r\n \t<li>Interacting with a virtual robot through AR and MR<\/li>\r\n<\/ul>\r\n<\/li>\r\n \t<li><strong>Demo 2: Colocalization<\/strong>\r\n<ul>\r\n \t<li>Azure Spatial Anchors SDK for localization of robots and MR devices<\/li>\r\n \t<li>Creating and querying spatial anchors using sample data<\/li>\r\n \t<li>How to use this code with your own camera<\/li>\r\n<\/ul>\r\n<\/li>\r\n<\/ol>\r\n<h2>Demo Materials<\/h2>\r\n<h3><strong>Demo 1 - Interaction<\/strong><\/h3>\r\nSample code for the exercises in this demo can be found here: <a 
href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\">https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo<\/a>\r\n\r\nThis repo contains an extensive <a href=\"https:\/\/github.com\/microsoft\/mixed-reality-robot-interaction-demo\/wiki\">wiki<\/a> with instructions on how to run the demo with pre-built apps and docker containers, how to set up your system to develop and deploy MR apps, and how to adapt the sample code to your own robot.\r\n<h3>Demo 2 - Colocalization<\/h3>\r\nThis demo focuses on a special <em>research-only<\/em> software package: the Azure Spatial Anchors Linux SDK.\u00a0 Instructions for obtaining the closed-source binaries and open-source ROS wrapper can be found at the wrapper's GitHub page: <a href=\"https:\/\/github.com\/microsoft\/azure_spatial_anchors_ros\">https:\/\/github.com\/microsoft\/azure_spatial_anchors_ros<\/a>\r\n\r\nThe <a href=\"https:\/\/github.com\/microsoft\/azure_spatial_anchors_ros\/wiki\">wiki<\/a> in this repo contains instructions for running the demo using sample datasets, an overview of the structure of the ASA interface and features of the ROS node, as well as some tips for using ASA from a live camera.\r\n<h2>Conclusion<\/h2>\r\nWe hope that this information and these tools help you to incorporate Mixed Reality into your robotics projects, for colocalization and\/or human-robot interaction.\u00a0 We would like to encourage you to send us feedback on your experience with the tutorial.\u00a0 Please engage with us on GitHub by filing issues (for questions or problems not covered in the wikis) or contributing to the two repositories."}],"msr_startdate":"2020-10-25","msr_enddate":"2020-10-25","msr_event_time":"","msr_location":"On-demand Virtual","msr_event_link":"","msr_event_recording_link":"","msr_startdate_formatted":"October 25, 2020","msr_register_text":"Watch now","msr_cta_link":"","msr_cta_text":"","msr_cta_bi_name":"","featured_image_thumbnail":"<img width=\"960\" 
height=\"540\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-960x540.jpg\" class=\"img-object-cover\" alt=\"photo of the Las Vegas Strip at night\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-343x193.jpg 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/Mixed-Reality-Robotics-Workshop-Vegas2020_1920x720-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","event_excerpt":"The Mixed Reality and Robotics tutorial at IROS 2020 will introduce the audience to both the high-level concepts of Mixed Reality and practically demonstrate how these concepts can be used to interact with a robot through an MR device.","msr_research_lab":[602418],"related-researchers":[{"type":"user_nicename","display_name":"Jeffrey Delmerico","user_id":38562,"people_section":"Section name 0","alias":"jedelmer"},{"type":"user_nicename","display_name":"Marc Pollefeys","user_id":36191,"people_section":"Section name 
0","alias":"mapoll"}],"msr_impact_theme":[],"related-academic-programs":[],"related-groups":[],"related-projects":[727042],"related-opportunities":[],"related-publications":[],"related-videos":[729874,729901],"related-posts":[701506],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/633120","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-event"}],"version-history":[{"count":42,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/633120\/revisions"}],"predecessor-version":[{"id":1133349,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/633120\/revisions\/1133349"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/661089"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=633120"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=633120"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=633120"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=633120"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=633120"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=633120"},{"taxonomy":"msr-program-audience","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-program-audience?post=633120"},{"taxonomy
":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=633120"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=633120"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}