Mixed Reality and Robotics – Tutorial @ ICRA 2020

Mixed, Augmented, and Virtual Reality offer exciting new frontiers in communication, entertainment, and productivity. A primary feature of Mixed Reality (MR) is the ability to register the digital world with the physical one, opening the door to a wide variety of robotics applications. This capability enables more natural human-robot interaction: instead of a user interfacing with a robot through a computer screen, we envision a future in which the user interacts with a robot in the same environment through MR, to see what it sees, to see its intentions, and to control it seamlessly in its own representation of the world.

The purpose of this tutorial is to introduce the audience to both the high-level concepts of Mixed Reality and practically demonstrate how these concepts can be used to interact with a robot through an MR device. We will discuss how various hardware devices (mobile phones, AR/MR/VR headsets, and robots’ on-board sensors) can integrate with cloud services to create a digital representation of the physical world, and how such a representation can be used for co-localization. All participants will get a chance to create an iOS, Android, or Microsoft HoloLens 2 app to control and interact with a small mobile robot.

Workshop Organizers

Marc Pollefeys
Helen Oleynikova
Jeff Delmerico

Please register using the link on the left so we can get an estimate of the number of attendees!


Tentative List of Topics Covered

These are subject to change at any time before the tutorial.

We intend to cover the “big picture” ideas of Mixed Reality and how we envision it will transform the way we interact with robots; technical details on a few different approaches to cross-device localization that allow any Mixed or Augmented Reality device to share a coordinate frame with a robot; and finally a practical portion introducing a few of the tools that are necessary to create full Mixed Reality experiences with robotics. The tutorial will of course culminate with demos where participants can interact with robots through AR on phones or MR on the HoloLens 2.

We will make all sample code and slides available open-source after the tutorial, so that all attendees will be able to build on the concepts presented here.

  • Mixed Reality as an intuitive bridge between robots and humans
  • MR, AR, VR, a brief overview of differences and sample devices
  • Co-localization with Mixed Reality devices
    • AR-tag-based
    • Vision-based
    • Shared-map-based
  • Azure Spatial Anchors
    • Technical introduction
    • How to use the SDK
  • Writing and deploying phone and HoloLens apps
    • Unity
    • ROS# and ROS bridge for interfacing with ROS
    • Azure Spatial Anchors SDK for localization
  • Demos!
    • Control a robot through your phone (we’ll provide the compiled apps, and a few spare test phones)
    • Control a robot through a HoloLens 2!
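To give a flavor of the ROS bridge portion above: an app talks to a ROS robot by sending JSON messages over a WebSocket to a rosbridge server, following the rosbridge v2 protocol (ROS# does the equivalent in C# on the Unity side). The sketch below only constructs those JSON strings; the topic name /cmd_vel and the helper function names are our own illustrative choices, not part of the tutorial materials.

```python
import json

def advertise_msg(topic, msg_type):
    """rosbridge v2 'advertise' operation: declare a topic before publishing to it."""
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def twist_publish_msg(topic, linear_x, angular_z):
    """rosbridge v2 'publish' operation carrying a geometry_msgs/Twist velocity command."""
    twist = {
        "linear": {"x": linear_x, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": angular_z},
    }
    return json.dumps({"op": "publish", "topic": topic, "msg": twist})

# In a real app these strings would be sent over a WebSocket to the
# rosbridge server (typically ws://<robot>:9090); here we only build them.
adv = advertise_msg("/cmd_vel", "geometry_msgs/Twist")
cmd = twist_publish_msg("/cmd_vel", 0.2, 0.5)
```

On the robot side, a rosbridge_server node translates these messages into native ROS publications, so the same mobile or HoloLens app can drive any robot that exposes a velocity topic.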

How to Prepare

If you would like to follow along during the tutorial (this is not required in any way; all resources will be made available to participants afterward, so you can try everything in the comfort of your own lab), we recommend installing some of the following:

  • Unity on Windows with Android or iOS support (depending on your phone), and UWP support if you’d like to try it on a HoloLens (Unity is also available on Linux, but we haven’t personally had much success with the Android SDKs on Linux within Unity)
  • Ubuntu 18.04 with ROS Melodic if you’d like to interact directly with the robots over ROS (or alternatively native ROS on Windows)
  • An SSH client, such as PuTTY or PowerShell on Windows, or any terminal you like with SSH support on Linux or macOS, if you’d like to deploy the software on the robots themselves

In any case, please register on the registration site! This helps us estimate how many people will attend the tutorial.