VROOM: Virtual Robot Overlay for Online Meetings

CHI 2020 Late Breaking Work

Organized by ACM SIGCHI

Telepresence robots allow remote users to freely explore a space they are not in and provide a physical embodiment in that space. However, they lack a compelling representation of the remote user in the local space. We present VROOM (Virtual Robot Overlay for Online Meetings), a two-way system for exploring how to improve the social experience of robotic telepresence. For the local user, an augmented-reality (AR) interface shows a life-size avatar of the remote user overlaid on a telepresence robot. For the remote user, a head-mounted virtual-reality (VR) interface presents an immersive 360° view of the local space with mobile autonomy. The VR system tracks the remote user's head pose and hand movements, which are applied to the avatar. This allows the local user to see the remote user's head direction and hand gestures, and allows the remote user to identify with the robot as an embodiment of self.

  • See the video figure at the bottom of this page.
  • This CHI Late Breaking Work paper covers the concept and implementation of the VROOM system. We are writing an evaluation paper comparing standard robotic telepresence to the VROOM system.
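The pose pipeline described above, where the VR side tracks the remote user's head pose and hand movements and the AR side applies them to the overlaid avatar, can be sketched as a simple message round trip. The actual VROOM wire format is not specified here, so the `PoseUpdate` message, its JSON encoding, and all field names are assumptions for illustration only:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical message carrying one frame of remote-user tracking data.
# Positions are (x, y, z) in the robot's local frame; rotation is a
# quaternion (x, y, z, w). These conventions are assumptions, not
# the published VROOM protocol.
@dataclass
class PoseUpdate:
    head_pos: tuple
    head_rot: tuple
    left_hand: tuple
    right_hand: tuple

def encode(update: PoseUpdate) -> bytes:
    """Serialize a pose update on the VR side for transmission."""
    return json.dumps(asdict(update)).encode("utf-8")

def decode(data: bytes) -> PoseUpdate:
    """Reconstruct the pose update on the AR side; the AR client would
    then apply these transforms to the avatar's head and hands so the
    local user sees the remote user's head direction and gestures."""
    d = json.loads(data.decode("utf-8"))
    return PoseUpdate(tuple(d["head_pos"]), tuple(d["head_rot"]),
                      tuple(d["left_hand"]), tuple(d["right_hand"]))

# Round-trip example: one frame with the head at standing height.
frame = PoseUpdate((0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0),
                   (-0.3, 1.2, 0.3), (0.3, 1.2, 0.3))
assert decode(encode(frame)) == frame
```

In a running system these updates would be streamed continuously (e.g. per rendered frame) over the same network link that carries the 360° video, with the AR client interpolating between updates to keep the avatar's motion smooth.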

Publication Downloads

VROOM: Virtual Robot Overlay for Online Meetings – Open Source Components

March 17, 2021

This repository provides a set of code samples illustrating how the VROOM Cross-Reality (XR) telepresence prototype system was assembled. We hope that it will help other researchers prototype similar XR telepresence experiences.

Download Data

VROOM: Virtual Robot Overlay for Online Meetings – Video Figure

We thank our Microsoft colleagues who helped us tremendously in developing the VROOM prototype: James Scott, Xu Cao, He Huang, Minnie Liu, Zhao Jun, Matthew Gan, and Leon Lu.