Cloud-Enabled Interactive Sound Propagation for Untethered Mixed Reality

AES International Conference on Audio for Virtual and Augmented Reality (AVAR)

Organized by the Audio Engineering Society

Presentation (ppt)

We describe the first system for physically-based wave acoustics, including diffraction effects, within a holographic experience shared by multiple untethered devices. Our system scales across standalone mobile-class devices, from a HoloLens to a modern smartphone. Audio propagation in real-world scenes exhibits perceptually salient effects that complement visuals. These include diffraction losses from obstruction, re-direction (“portaling”) of sounds around physical doorways and corners, and reverberation in complex geometries with multiple connected spaces. Such effects are necessary in mixed reality to achieve a sense of presence for virtual people and things within the real world, but they have so far been computationally infeasible on mobile devices. We propose a novel cloud-enabled system that makes such immersive audio-visual scenarios possible on untethered mixed reality devices for the first time.
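For illustration only, the sketch below shows one way the per-source quantities named above (obstruction loss, a redirected “portaled” arrival direction, and reverberation) could be represented and handed to a spatializer each frame. The names and structure are hypothetical and do not reflect the paper's actual interface.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class AcousticParams:
    # Per-source parameters corresponding to the effects described above.
    obstruction_db: float      # diffraction loss from occluding geometry (negative dB)
    arrival_direction: Tuple[float, float, float]  # unit vector toward the apparent (portaled) arrival
    reverb_time_s: float       # late-reverberation decay time at the listener
    reverb_level_db: float     # wet level relative to the direct path

def apply_to_spatializer(voice, params: AcousticParams) -> None:
    # 'voice' stands in for any spatializer voice object; the calls are illustrative.
    voice.set_gain_db(params.obstruction_db)
    voice.set_direction(params.arrival_direction)
    voice.set_reverb(params.reverb_time_s, params.reverb_level_db)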

The talk and demo can be viewed here.

System Architecture

We propose a novel system architecture in which an “acoustic map” in the cloud stores acoustic simulation data bound to locations in the real world (for example, via Azure Spatial Anchors).
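As a rough sketch of this idea (with hypothetical names, not the actual service interface), the acoustic map can be thought of as a store keyed by spatial-anchor identifiers: a capable machine publishes simulation data bound to an anchor, and an untethered client fetches only the data for anchors it resolves nearby.

from dataclasses import dataclass

@dataclass
class AcousticTile:
    anchor_id: str   # identifier of the world-locked anchor the data is bound to
    data: bytes      # serialized acoustic simulation results for the region near the anchor

class AcousticMap:
    """Cloud-side store that maps anchor IDs to acoustic simulation data."""

    def __init__(self) -> None:
        self._tiles: dict[str, AcousticTile] = {}

    def publish(self, tile: AcousticTile) -> None:
        # Upload (or overwrite) the simulation data bound to an anchor.
        self._tiles[tile.anchor_id] = tile

    def fetch(self, anchor_ids: list[str]) -> list[AcousticTile]:
        # A device resolves nearby anchors, then downloads only the
        # acoustic data bound to those anchors.
        return [self._tiles[a] for a in anchor_ids if a in self._tiles]

In this sketch, a client would resolve the spatial anchors visible in its surroundings, call fetch with their IDs, and load the returned data into its local audio engine.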