At Game Developers Conference 2019, we shared an early peek at Responsive Spatial Audio for Immersive Gaming, a Microsoft Garage project. The Unity plug-in helps developers infuse accessibility into games by making it easy to annotate game objects with descriptive text and present it to players through interactive audio cues. The project is now available worldwide in the Unity Asset Store.
Baking accessibility into game development
A number of hackers have joined the cause to make games more accessible. For example, Ear Hockey, a Microsoft Garage project, is a game designed for the blind and low-vision community, and the Xbox Adaptive Controller, a Hackathon project turned Garage Wall of Famer, is a game controller designed for gamers of all abilities. The Garage project team members who developed Responsive Spatial Audio are taking a different approach, focusing on the game developer by baking accessibility right into an easy, drag-and-drop interaction toolkit.
With Responsive Spatial Audio, game developers can tag 3D objects with descriptive text, and the experience captures these tags and their spatial coordinates to help players navigate. As players traverse the game world and encounter tagged objects and designated points of interest, they are guided by audio cues via a built-in text-to-speech API. An accessible FPS controller presents relevant descriptions at the right time by monitoring player movement, scanning the surroundings for metadata, and cuing spatial audio guidance for objects in the field of view.
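The plug-in's internal code isn't published in this post, but the core idea of the controller's scan, checking which tagged objects fall inside a configurable viewing frustum, can be sketched in a few lines. The sketch below is in Python for brevity, reduces the problem to a 2D heading test on the horizontal plane, and uses illustrative names (`TaggedObject`, `in_view`) that are not the plug-in's API:

```python
import math
from dataclasses import dataclass

@dataclass
class TaggedObject:
    description: str  # descriptive text the developer attached to the object
    x: float          # object position on the horizontal plane
    z: float

def in_view(player_pos, facing_deg, obj, frustum_length=10.0, frustum_arc_deg=60.0):
    """Return True if a tagged object lies inside the viewing frustum:
    within `frustum_length` units and `frustum_arc_deg` degrees of the heading."""
    dx, dz = obj.x - player_pos[0], obj.z - player_pos[1]
    if math.hypot(dx, dz) > frustum_length:
        return False
    bearing = math.degrees(math.atan2(dx, dz)) % 360
    # Smallest signed angle between the object's bearing and the player's heading
    delta = abs((bearing - facing_deg + 180) % 360 - 180)
    return delta <= frustum_arc_deg / 2

def descriptions_in_view(player_pos, facing_deg, objects, **kw):
    # The surviving descriptions would be handed to text-to-speech as spatial cues
    return [o.description for o in objects if in_view(player_pos, facing_deg, o, **kw)]
```

In the plug-in itself this kind of check runs against Unity's 3D scene, and the resulting descriptions are voiced through the built-in text-to-speech API as spatial cues; the sketch only illustrates the distance-and-arc filtering step.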
Key features to provide a more accessible experience
Responsive Spatial Audio offers a number of features that make prioritizing accessibility easy.
Accessible FPS Controller: Convey object descriptions within the player's field of view via audio cues, and adjust the viewing frustum's length and arc.
Annotate Game Objects: Tag and manage objects with descriptive text. Tag once, and descriptions appear everywhere the object does.
Vantage Point Objects: Add and manage vantage points, invisible doorframe-like points of interest that convey a whole view (as opposed to individual objects within the viewing frustum). Present different descriptions based on the direction the player is facing.
Accessible Navigation: Aid player navigation with a suite of interaction tools, including:
- Guide players to a selected object via a navigation agent with an orientation and spatial beacon
- Add a script that guides players to nearby vantage points with auditory beeps
- Enable bump noises with custom sounds that play spatial audio on collision, based on the player's orientation
- Change the background audio based on the player's location
- Indicate the game world's global north and south with spatial sound
Inventory UI: Use the optional built-in inventory UI to easily manage a library of game objects.
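To make the vantage point behavior concrete: presenting a different description depending on which way the player faces reduces to mapping the player's compass heading onto one of several sectors. A minimal sketch of that lookup, in Python with hypothetical names rather than the plug-in's API:

```python
def vantage_description(facing_deg, sector_descriptions):
    """Pick the description for the compass sector containing the player's heading.

    `sector_descriptions` covers N equal sectors clockwise from north,
    with sector 0 centered on north (heading 0 degrees).
    """
    n = len(sector_descriptions)
    sector_width = 360.0 / n
    # Shift by half a sector so each sector is centered on its compass direction
    sector = int(((facing_deg % 360) + sector_width / 2) // sector_width) % n
    return sector_descriptions[sector]
```

With four sectors, for example, a heading of 95 degrees selects the east-facing description; a vantage point object could attach this kind of lookup to an invisible trigger in the scene and voice the result through text-to-speech.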
To see how you can incorporate Responsive Spatial Audio into your games, see the project in action in the demo that accompanies the plug-in in the Unity Asset Store.
One step closer to seamless, accessible development
We sat down with Brannon Zahand and Evelyn Thomas, both Senior Program Managers in Accessibility R&D who champion accessibility in the gaming space, to hear their reflections on the project. “The idea that I can drag and drop this into a game, with very little work to implement it, is a game changer for the industry,” shared Brannon. Evelyn attended GDC 2019 to talk to developers about accessibility best practices, highlighting the project in a conference talk and at Microsoft’s accessibility booth.
“The idea that I can drag and drop this into a game, with very little work to implement it, is a game changer for the industry.”
Responsive Spatial Audio was developed by Manohar Swaminathan, a Senior Researcher at Microsoft Research in Bangalore, India. Manohar had worked in graphics for years but found a passion for accessibility while working on CodeTalk, a solution that empowers developers in the blind and low-vision community to do more with Visual Studio. He was searching for ways to do more impactful work in India when he met and teamed up with former Research Fellow Venkatesh Potluri, a blind developer who was interested in enhancing his productivity. After releasing CodeTalk, Manohar was inspired to combine his graphics background with games and VR to make the gaming space more accessible through audio. “We thought, ‘Can we use rich, spatial audio content to replace the visual information that is missing?’ and decided to give it a shot,” he shared. It’s Manohar’s hope that plug-and-play tools will inspire developers to create fun, inclusive game experiences accessible to all.