Parametric Wave Field Coding for Precomputed Sound Propagation

ACM Transactions on Graphics (TOG), Vol. 33 (Proceedings of ACM SIGGRAPH 2014)


The acoustic wave field in a complex scene is a chaotic 7D function of time and the positions of source and listener, making it difficult to compress and interpolate. This hampers precomputed approaches which tabulate impulse responses (IRs) to allow immersive, real-time sound propagation in static scenes. We code the field of time-varying IRs in terms of a few perceptual parameters derived from the IR’s energy decay. The resulting parameter fields are spatially smooth and compressed using a lossless scheme similar to PNG. We show that this encoding removes two of the seven dimensions, making it possible to handle large scenes such as entire game maps within 100MB of memory. Run-time decoding is fast, taking 100 microseconds per source. We introduce an efficient and scalable method for convolutionally rendering acoustic parameters that generates artifact-free audio even for fast motion and sudden changes in reverberance. We demonstrate convincing spatially-varying effects in complex scenes including occlusion/obstruction and reverberation, in our system integrated with Unreal Engine(TM).
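The perceptual parameters described above are derived from each IR's energy decay. As a rough illustration (not the paper's actual encoder), the sketch below applies standard Schroeder backward integration to an IR and extracts three such parameters: direct-sound energy, early-reflection energy, and a 60 dB decay time. The time windows and dB fit range are illustrative assumptions.

```python
import numpy as np

def extract_decay_parameters(ir, fs=48000, direct_ms=10.0, early_ms=100.0):
    """Extract example perceptual parameters from an IR's energy decay.

    Returns (direct_energy, early_energy, rt60_seconds).
    """
    energy = np.asarray(ir, dtype=np.float64) ** 2

    # Schroeder backward integration: energy remaining after each sample.
    edc = np.cumsum(energy[::-1])[::-1]

    # Direct sound and early reflections from fixed time windows.
    n_direct = int(fs * direct_ms * 1e-3)
    n_early = int(fs * early_ms * 1e-3)
    direct_energy = energy[:n_direct].sum()
    early_energy = energy[n_direct:n_early].sum()

    # Decay time: fit the log energy-decay curve between -5 dB and -35 dB,
    # then extrapolate the slope to a 60 dB drop (a standard RT60 estimate).
    edc_db = 10.0 * np.log10(edc / edc[0] + 1e-30)
    t = np.arange(len(energy)) / fs
    mask = (edc_db <= -5.0) & (edc_db >= -35.0)
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
    rt60 = -60.0 / slope
    return direct_energy, early_energy, rt60
```

Because such parameters vary smoothly as the source and listener move, they interpolate and compress far better than the raw, oscillatory IRs themselves.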

Accompanying presentation material can be found here.
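The abstract notes that the spatially smooth parameter fields are compressed with a lossless scheme similar to PNG. A hypothetical sketch of that idea: quantize the field, predict each sample from its left neighbor (as PNG's "Sub" filter does), and entropy-code the small residuals with zlib. The quantization step and 2D field layout are illustrative assumptions, not the paper's actual format.

```python
import zlib
import numpy as np

def compress_parameter_field(field, step=0.25):
    """PNG-style lossless coding of a quantized 2D parameter field:
    quantize, delta-predict along rows, entropy-code with zlib."""
    q = np.round(np.asarray(field, dtype=np.float64) / step).astype(np.int16)
    resid = q.copy()
    # Left-neighbor prediction (like PNG's "Sub" filter): a smooth field
    # leaves near-zero residuals, which the entropy coder packs tightly.
    resid[:, 1:] = q[:, 1:] - q[:, :-1]
    return zlib.compress(resid.tobytes(), level=9), q.shape

def decompress_parameter_field(blob, shape, step=0.25):
    """Invert the prediction and dequantize; exact over quantized values."""
    resid = np.frombuffer(zlib.decompress(blob), dtype=np.int16).reshape(shape)
    q = np.cumsum(resid, axis=1, dtype=np.int16)
    return q * step
```

In the paper's setting, each parameter (e.g., decay time) forms its own smooth field over source and listener positions, which is exactly the regime where neighbor prediction pays off.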


Interactive sound simulation: Rendering immersive soundscapes in games and virtual reality

The audio-visual immersion of game engines and virtual reality/mixed reality has a vast range of applications, from entertainment to productivity. Physical simulation is required in these applications to produce nuanced, believable renderings that respond fluidly to unpredictable user interaction. Simulating sound phenomena synchronized with visuals must be done within tight real-time constraints. The wave behavior of audible sound is quite different from visible light, requiring fundamentally distinct techniques. The resulting challenges have impeded practical adoption in the past, but these barriers are finally being overcome, with accelerating usage of advanced sound technologies in interactive applications today. In this webinar led by Microsoft Principal Researcher Dr. Nikunj Raghuvanshi, learn the ins and outs of creating practical, high-quality sound simulations. You will get an overview of the three components of sound simulation: synthesis, propagation, and spatialization. For each, we will review the underlying physics, research techniques, practical considerations, and open research questions. Special focus will be on Project Triton: a sound propagation technology being designed at Microsoft Research that ships in major games and virtual reality experiences today. 
Together, we will explore:
- Challenges and requirements of practical interactive simulation
- Fundamentals of the physics of sound generation, propagation, and perception
- Overview of research in sound simulation and future research directions
- Detailed discussion of Project Triton for fast sound wave propagation
- Project Acoustics for plugin integration of Triton and HRTF spatialization in your projects
- Various audio-visual demos of immersive sound simulation

Resource list:
- Project Triton
- Download and docs on the Project Acoustics toolset
- Project Acoustics GitHub forums
- Parametric Wave Field Coding for Precomputed Sound Propagation (Publication)
- Parametric Directional Coding for Precomputed Sound Propagation (Publication)
- Incorporating directional sources into interactive sound propagation (Publication)
- Project Triton and the physics of sound with Dr. Nikunj Raghuvanshi (Podcast)
- Nikunj Raghuvanshi at Microsoft

*This on-demand webinar features a previously recorded Q&A session and open captioning.

Explore more Microsoft Research webinars: https://aka.ms/msrwebinars