Making sound for games feel immersive and realistic
Making games and augmented/virtual reality (AR/VR) feel immersive requires realistic-sounding audio, as characters, events, and users move through virtual rooms, environments, and ecosystems. Project Triton is a physics-based audio design system that creates an acoustic rendering that sounds natural, while keeping CPU usage modest and providing designer control. A product of a decade of sustained research into efficient wave propagation, it is the first demonstration that accurate wave acoustics can meet the demanding needs of AAA games like Gears of War.
Project Acoustics is a larger product effort to make wave acoustics widely usable, employing Triton as the acoustics engine, with wave solver and encoder components running in Azure, and using an editor/runtime plugin for the Unity game engine along with integrated audio design tools. If you want to try it out, register to download the designer preview.
Immersive wave effects
(Figure captions; images omitted)
Obstruction: sound is weakened when it diffracts around obstructions.
Portaling: rendering sound "through the wall" (red), straight from the source's direction, is unrealistic; doorways funnel sound, which sounds more realistic.
Occlusion: the total reduction in loudness caused by intervening geometry, involving complex propagation and diffraction (diffracted paths in green).
Clarity and reverberance: close to the source (left), the direct path (green) is loud compared to reflections (orange), yielding high clarity and low reverberance; behind the partition (right), the direct path is weakened by diffraction, yielding low clarity and high reverberance and conveying that the source is in a different room.
Decay time: larger rooms reverberate longer.
Project Triton automatically renders believable environmental effects that transition smoothly as the player moves through the world, as illustrated by the captions above. These wave effects, such as obstruction and portaling, involve wave diffraction, making them challenging to compute within the tight CPU budget of a game. Audio designers have therefore traditionally built such auditory experiences by hand, a tedious and expensive process. Project Triton automates that tedium while retaining designer control: the designer can still modify the acoustics for storytelling goals, for example reducing reverberance to improve speech intelligibility, or increasing reverb decay time to make a cave feel spookier. It all fits within roughly 10 percent of a CPU core.
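To make the designer-control idea concrete, here is a minimal sketch of how baked acoustic parameters might be adjusted per design intent before driving a source's signal processing. All names and structures here are hypothetical illustrations, not Project Triton's actual API.

```cpp
// Hypothetical sketch: applying designer overrides to baked acoustic
// parameters at runtime. Illustrative only; not Project Triton's API.
#include <algorithm>

// Per-source acoustic parameters, as a bake might produce them.
struct AcousticParams {
    float obstructionDb;  // loudness loss along the direct path (dB)
    float reverbLevelDb;  // reverb energy relative to the dry path (dB)
    float decayTimeSec;   // RT60-style reverb decay time (seconds)
};

// Designer intent, authored per zone or per source.
struct DesignOverrides {
    float reverbLevelOffsetDb = 0.0f; // e.g. -6 dB for speech intelligibility
    float decayTimeScale      = 1.0f; // e.g. 1.5x to make a cave feel spookier
};

// Combine the physical bake result with storytelling adjustments.
AcousticParams ApplyOverrides(AcousticParams p, const DesignOverrides& d) {
    p.reverbLevelDb += d.reverbLevelOffsetDb;
    p.decayTimeSec   = std::max(0.0f, p.decayTimeSec * d.decayTimeScale);
    return p;
}
```

The key design point is that the physics-based bake supplies a believable baseline, and designer intent is layered on top as cheap parameter adjustments rather than hand-authored acoustics.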
How it works
Project Triton models how sound waves actually travel through complex 3D spaces: diffracting around corners and through doorways, reverberating in rooms of different shapes and sizes, and responding to each triangle's material. This computation is extremely expensive, so it is precomputed on the game's static visual 3D geometry on a compute cluster in a "baking" step. The pipeline is closely analogous to light baking, moving the expensive global propagation computation offline rather than performing it during gameplay, where CPU budgets are tight. The baked data is passed through a proprietary parametric compressor that drastically reduces its size and enables fast lookup and signal processing at runtime, allowing Triton to run even on mobile devices such as the Oculus Go.
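To illustrate the bake/runtime split, here is a minimal sketch assuming a hypothetical encoding and lookup interface (none of these names come from Project Triton): the offline solve is reduced to a few compact perceptual parameters per listener-probe/source-cell pair, and the runtime performs only a cheap table lookup.

```cpp
// Hypothetical sketch of the bake/runtime split described above.
// Offline: a wave solver on a compute cluster produces, for each
// (listener probe, source cell) pair, a handful of perceptual parameters.
// Runtime: a table lookup plus light signal processing. Names illustrative.
#include <cstdint>
#include <unordered_map>

struct Vec3 { float x, y, z; };

// Compact perceptual parameters stored per (probe, cell) pair by the bake.
struct EncodedAcoustics {
    float obstructionDb;  // diffraction loss on the direct path
    float decayTimeSec;   // reverb decay time at this pair
    Vec3  arrivalDir;     // perceived arrival direction (enables portaling)
};

// Baked lookup table keyed by listener probe and quantized source cell.
class AcousticsBake {
public:
    EncodedAcoustics Query(uint32_t probe, uint32_t sourceCell) const {
        // Pack the pair of 32-bit indices into one 64-bit key.
        auto it = table_.find((uint64_t(probe) << 32) | sourceCell);
        return it != table_.end() ? it->second : EncodedAcoustics{};
    }
private:
    std::unordered_map<uint64_t, EncodedAcoustics> table_;
};
```

This mirrors why the approach is cheap at runtime: no wave simulation happens during gameplay, only a lookup of precomputed parameters that then drive standard audio DSP (gains, filters, reverbs).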
This 2017 Game Developers Conference (GDC) talk provides a general introduction to the ideas behind Project Triton. For technical details, consult this SIGGRAPH 2014 paper, which contains the main ideas, and this 2018 paper, which extends the work to directional audio effects such as portaling.