Project Triton

Overview

Making sound for games feel immersive and realistic

Making games and augmented/virtual reality (AR/VR) feel immersive requires realistic-sounding audio, as characters, events, and users move through virtual rooms, environments, and ecosystems. Project Triton is a physics-based audio design system that creates an acoustic rendering that sounds natural, while keeping CPU usage modest and providing designer control. A product of a decade of sustained research into efficient wave propagation, it is the first demonstration that accurate wave acoustics can meet the demanding needs of AAA games like Gears of War.

Project Acoustics is the larger product effort that makes the internal Triton technology available externally for any game, via Unity and Unreal game engine plugins, with offline computation performed in Azure. Learn more and download.

Immersive wave effects

Obstruction

Sound is weakened when it diffracts around obstructions.

Portaling

Rendering sound as if it arrives “through the wall” (red), straight from the source direction, is incorrect. Doors funnel the sound toward the listener, which is more realistic.


Occlusion

Occlusion is the total reduction in loudness caused by intervening geometry, involving complex propagation and diffraction (diffracted paths in green).

Reverberance

Left: Close to the source, the direct path (green) is loud compared to reflections (orange), resulting in high clarity and low reverberance. Right: Behind the partition, the direct path is weakened by diffraction, causing low clarity and high reverberance, conveying that the source is in a different room.


Decay Time

Larger rooms reverberate longer.
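
This last trend has a classical counterpart. As a rough point of reference only (Project Triton derives decay times from its wave simulation, not from this formula), Sabine’s formula predicts that reverberation time grows with room volume and shrinks with total surface absorption:

    T_{60} \approx \frac{0.161 \, V}{\sum_i S_i \alpha_i}

where V is the room volume in cubic meters, S_i are the surface areas in square meters, and \alpha_i their absorption coefficients.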

Project Triton automatically renders believable environmental effects that transition smoothly as the player moves through the world, as illustrated above. These wave effects, such as obstruction and portaling, involve wave diffraction, which makes them challenging to compute within the tight CPU budget of a game. Because of this, audio designers have traditionally built such auditory experiences by hand, a tedious and expensive process. Project Triton automates that tedium while retaining designer control: the designer can still modify the acoustics for storytelling goals, for example reducing reverberance to improve speech intelligibility or increasing reverb decay time to make a cave feel spookier. It all fits within ~10 percent of a CPU core.
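
To make that division of labor concrete, here is a minimal sketch of how designer intent might be layered on top of the simulated acoustics. The types and names (AcousticParams, ZoneOverride, applyOverrides) are hypothetical illustrations, not the Project Acoustics API.

    // Hypothetical sketch: physically simulated acoustic parameters for one
    // source/listener pair, with per-zone designer overrides applied on top.
    // These names are illustrative and are not the Project Acoustics API.
    #include <algorithm>

    struct AcousticParams {
        float obstructionDb;   // attenuation of the direct (diffracted) path
        float reverbLevelDb;   // loudness of reflected energy
        float decayTimeSec;    // how long the space rings (RT60-style)
    };

    struct ZoneOverride {
        float reverbLevelOffsetDb = 0.0f;  // e.g. -6 dB to improve speech intelligibility
        float decayTimeScale      = 1.0f;  // e.g. 1.5 to make a cave feel spookier
    };

    // The simulation supplies believable defaults; the designer only nudges them.
    AcousticParams applyOverrides(AcousticParams simulated, const ZoneOverride& zone) {
        simulated.reverbLevelDb += zone.reverbLevelOffsetDb;
        simulated.decayTimeSec   = std::max(0.1f, simulated.decayTimeSec * zone.decayTimeScale);
        return simulated;
    }

The point of the split is that the physics supplies a believable baseline everywhere, so designer effort goes only where the story calls for a deliberate departure from realism.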

How it works

Project Triton models how sound waves actually travel through complex 3D spaces, diffracting around corners and through doorways, reverberating in different rooms, and responding to each triangle’s material. This computation is extremely expensive, so it is precomputed over the scene’s static visual 3D geometry on a compute cluster in a “baking” step. The pipeline is closely analogous to light baking: the expensive global propagation computation moves to an offline step rather than running during gameplay, where CPU time is scarce. The simulation results are passed through a proprietary parametric compressor that drastically reduces data size and enables fast lookup and signal processing at runtime, allowing Triton to run even on mobile devices such as the Oculus Go.
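
At runtime, that pipeline reduces to a cheap per-frame lookup plus standard signal-processing parameter mapping. The sketch below shows the shape of that loop under assumed names; BakedAcousticData, queryParams, and Spatializer are hypothetical stand-ins, not the shipped Triton or Project Acoustics interfaces.

    // Hypothetical sketch of the runtime side: the wave simulation already ran
    // offline during the bake, so each frame only interpolates compressed
    // parameters and drives per-source DSP. Names are illustrative only.
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    struct AcousticParams {
        float obstructionDb;     // direct-path attenuation from diffraction around geometry
        Vec3  arrivalDirection;  // portaling: the direction the sound appears to come from
        float reverbLevelDb;     // loudness of reflected energy
        float decayTimeSec;      // how long the space rings
    };

    struct BakedAcousticData {
        // Stand-in for decoding/interpolating the compressed precomputed data;
        // the essential property is that this lookup is cheap enough to run every frame.
        AcousticParams queryParams(const Vec3& /*source*/, const Vec3& /*listener*/) const {
            return AcousticParams{3.0f, {0.0f, 0.0f, 1.0f}, -12.0f, 1.4f};  // placeholder values
        }
    };

    struct Spatializer {
        // Stand-in for a game-audio voice (attenuation, panning, reverb send).
        void apply(const AcousticParams& p) const {
            std::printf("direct %.1f dB toward (%.1f, %.1f, %.1f), reverb %.1f dB, decay %.1f s\n",
                        -p.obstructionDb, p.arrivalDirection.x, p.arrivalDirection.y,
                        p.arrivalDirection.z, p.reverbLevelDb, p.decayTimeSec);
        }
    };

    // Per-frame update for one sound source.
    void updateSourceAudio(const BakedAcousticData& bake, const Spatializer& dsp,
                           const Vec3& sourcePos, const Vec3& listenerPos) {
        dsp.apply(bake.queryParams(sourcePos, listenerPos));
    }

Everything expensive, the actual wave solve, happens before this code ever runs; what remains is table lookup and conventional DSP, which is why the approach fits within a small slice of a CPU core.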

This 2017 Game Developers Conference (GDC) talk provides a general introduction to the ideas behind Project Triton. For technical details, consult this SIGGRAPH 2014 paper, which contains the main ideas, and this 2018 paper, which extends it to directional audio effects such as portaling.

Visit Project Acoustics for official documentation and downloads.

Product transfers

People

Microsoft Research

Microsoft Cloud + AI (Project Acoustics)

  • Noel Cross, Principal Software Engineering Lead
  • Hakon Strande, Principal Program Manager
  • Keith Godin, Senior Scientist
  • Mike Chemistruck, Senior Software Engineer
  • Ashu Tatake, Principal Software Engineer
  • Lyle Corbin, Principal Software Engineer
  • Kyle Storck, Software Engineer II
  • Panu Koponen, Software Engineer II

The Coalition Studio (Gears of War)

  • John Morgan, Audio Director
  • John Tennant, Lead Audio Designer
  • Jimmy Smith, Senior Software Engineer
  • Clarence Chu, Senior Software Engineer

RARE Studio (Sea of Thieves)

  • Jon Vincent, Audio Director

Videos

Project Acoustics: Making Waves with Triton

Project Acoustics is now available for all game developers and sound designers to use. It employs the Triton technology developed in Microsoft Research for accurate sound propagation using wave physics.…

Parametric Directional Coding for Precomputed Sound Propagation

Convincing audio for games and virtual reality requires modeling directional propagation effects. The initial sound’s arrival direction is particularly salient and derives from multiply-diffracted paths in complex scenes…


Gears of War 4, Project Triton: Pre-Computed Environmental Wave Acoustics

In this 2017 GDC talk, Microsoft’s Nikunj Raghuvanshi and John Tennant discuss both the technical and design aspects of Project Triton, a new audio system that robustly models complex wave phenomena such as diffraction…

Parametric Wave Field Coding for Precomputed Sound Propagation

The acoustic wave field in a complex scene is a chaotic 7D function of time and the positions of source and listener, making it difficult to compress and interpolate. This hampers precomputed approaches which tabulate impulse…


Wave-Based Sound Propagation in Large Open Scenes using an Equivalent Source Formulation

We present a novel approach for wave-based sound propagation suitable for large, open spaces spanning hundreds of meters, with a small memory footprint. The scene is decomposed into disjoint rigid objects. The free-field acoustic…

Sound Synthesis for Impact Sounds in Video Games

We present an interactive system for synthesizing high quality, physically based audio on current video game consoles. From a recorded impact sound, we compute a modal model, which we use to synthesize variations of the sound…


Precomputed Wave Simulation for Real-Time Sound Propagation of Dynamic Sources in Complex Scenes

We present a method for real-time sound propagation that captures all wave effects, including diffraction and reverberation, for multiple moving sources and a moving listener in a complex, static 3D scene…

Aerophones in Flatland: Interactive Wave Simulation of Wind Instruments

We present the first real-time technique to synthesize full bandwidth sounds for 2D virtual wind instruments. A novel interactive wave solver is proposed that synthesizes audio at 128,000 Hz on commodity graphics cards. Simulating…

In the news