Enabling interaction between mixed reality and robots via cloud-based localization

Date

October 28, 2020

Speakers

Marc Pollefeys, Juan Nieto, Helen Oleynikova, Jeff Delmerico

Affiliation

Microsoft Mixed Reality and AI Lab Zurich

Overview

We introduce a way to enable more natural interaction between humans and robots through mixed reality by using a shared coordinate system. Azure Spatial Anchors, which already supports colocalizing multiple HoloLens and smartphone devices in the same space, has now been extended to support robots equipped with cameras.

This allows humans and robots sharing the same space to interact naturally: humans can see the robot's plan and intent, while the robot can interpret commands given from the person's perspective. We hope this can be a building block toward a future in which humans and robots work together as collaborators and coworkers.
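To illustrate the idea of a shared coordinate system: once two devices localize against the same spatial anchor, each knows the anchor's pose in its own world frame, and chaining those two poses yields the transform between the devices' frames. The sketch below shows that chaining with plain homogeneous transforms; the numeric poses and variable names are illustrative assumptions, not output of the Azure Spatial Anchors SDK.

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical poses of the SAME physical anchor, as observed by each device
# in its own world frame (illustrative numbers, not SDK output).
T_robot_anchor = se3(rot_z(np.pi / 2), [2.0, 0.0, 0.0])  # anchor in robot frame
T_hl_anchor = se3(rot_z(-np.pi / 4), [0.5, 1.0, 0.0])    # anchor in HoloLens frame

# Both transforms reference the same anchor, so the transform taking points
# from the robot's frame into the HoloLens frame is their composition:
T_hl_robot = T_hl_anchor @ np.linalg.inv(T_robot_anchor)

# A waypoint from the robot's planned path, expressed in the robot's frame,
# can now be rendered at the correct physical spot in the HoloLens view.
p_robot = np.array([3.0, 1.0, 0.0, 1.0])
p_hl = T_hl_robot @ p_robot
print(np.round(p_hl[:3], 3))
```

The same composition works in reverse: inverting `T_hl_robot` maps a command given in the person's frame (e.g. "go to where I'm pointing") into coordinates the robot can plan against.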

Check out the code at aka.ms/ASALinuxSDK.

People