The hands-free music project is a participatory design collaboration with members of the ALS community.
Our mission is to restore critical expressive and creative channels to people severely affected by disabling conditions such as ALS and spinal cord injury. These conditions can isolate people and erect formidable barriers between them and their loved ones and communities.
We believe in the power of music to heal, connect, uplift, transcend, transform, and nurture the spirit, and we reject the notion that making music, performing live, playing instruments, expressing oneself through sound, and communing creatively with others are the exclusive domain of the able-bodied or limited by conventional laws of physics.
We believe that freedom and joy can be found in the most constrained of human experiences, and that combining empathy, compassion, and commitment to a higher good with creative combinations of technology and human-centered design can help break down the barriers erected by disease and disability. We aim to restore, redefine, and reinvent the music-making experience for people who are “locked in” or are otherwise affected by severe speech and mobility impairment.
To this end, we present three complementary technologies, colloquially known as Red-Eye Sound Studio: Microsoft Hands-Free Sound Jam, Microsoft Hands-Free Sound Machine, and Microsoft Expressive Pixels. Sound Jam and Sound Machine are eye-controlled applications for music performance and composition. Expressive Pixels is a visual platform that can be leveraged to augment stage performance, playback, and artist expression.
Microsoft Hands-Free Sound Jam is an eye-controlled music environment for electronic loop-based performance and composition. It is designed using familiar design paradigms found in commercial music production software, which have been adapted to work well with eye control. The core interface revolves around what is known as a clip launcher, inspired by the one found in the popular digital audio workstation Ableton Live. Using the Sound Jam clip launcher, one is able to perform a piece of music by scheduling, or “launching”, small musical fragments, known as “clips”, which are automatically aligned and quantized to the next downbeat of the song. Editing capabilities are provided through an editing window, where clips can be modified or rewritten entirely. Users can create custom sample sets, or save clips and combine them to form full-length compositions.
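The quantized launching described above can be sketched in a few lines. This is an illustrative example, not Sound Jam's actual code; the function name, parameters, and the simplifying assumptions (constant tempo, a song starting at sample zero) are ours.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch of clip-launch quantization: a launched clip does not
// start immediately, but is scheduled for the next downbeat so that every
// clip stays aligned to the song's grid. Given the current playback position
// in samples, the tempo, the meter, and the sample rate, return the sample
// index at which the clip should actually start.
std::int64_t next_downbeat_sample(std::int64_t position,
                                  double bpm,
                                  int beats_per_bar,
                                  double sample_rate) {
    const double samples_per_beat = sample_rate * 60.0 / bpm;
    const double samples_per_bar = samples_per_beat * beats_per_bar;
    const auto bar = static_cast<std::int64_t>(position / samples_per_bar);
    // If the playhead is exactly on a downbeat, launch now, not a bar later.
    if (static_cast<std::int64_t>(bar * samples_per_bar) == position) {
        return position;
    }
    return static_cast<std::int64_t>((bar + 1) * samples_per_bar);
}
```

At 120 BPM in 4/4 at 44100 Hz, a bar is 88200 samples, so a clip launched one sample into the song would be held until sample 88200.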
Sound Jam uses the PCEye Mini via the Tobii Stream Engine API for eye tracking. Graphics are drawn using OpenGL via NanoVG, a lightweight 2D vector graphics library. Persistent data storage is provided via SQLite. Realtime audio is achieved using RtAudio, and digital signal processing is built on top of the musical audio signal processing library Soundpipe.
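As a rough illustration of the real-time audio path, the sketch below shows the kind of block-based rendering an audio callback performs. It is stdlib-only: Sound Jam's actual path runs through RtAudio and Soundpipe, which are not reproduced here, and the `SineVoice` type is our invention.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative stand-in for a DSP voice: on each callback, fill a
// fixed-size block of output samples from the oscillator's running state.
constexpr double kTwoPi = 6.283185307179586;

struct SineVoice {
    double phase = 0.0;     // running phase, kept in [0, 1)
    double freq = 440.0;    // oscillator frequency in Hz
    double srate = 44100.0; // output sample rate in Hz

    void render(std::vector<float>& block) {
        for (std::size_t i = 0; i < block.size(); ++i) {
            block[i] = static_cast<float>(std::sin(kTwoPi * phase));
            phase += freq / srate;
            if (phase >= 1.0) phase -= 1.0;  // wrap to preserve precision
        }
    }
};
```

State such as `phase` lives across callbacks so that successive blocks join without clicks, which is the core discipline any real-time audio callback must follow.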
Read more about Sound Jam…
Microsoft Hands-Free Sound Machine is an eye-controlled, 16-step sound sequencer that lets users generate musical compositions from a combination of WAV and MIDI samples, and supports output to physical instruments as well as stage effects.
Physical instrument playback (with supporting visual affordances) is supported, as well as standard MIDI and WAV output.
One of the things we experimented with in this release was converting pre-recorded WAV clips of the voices of PALS (people with ALS), which had been “banked” for conventional use with an AAC (augmentative and alternative communication) system, into sample sets to be adapted and used as instruments by artists.
Composition and semi-live, loop-based playback are supported via a single-state dwell-based interface, and users can save and combine multiple clips into full-length compositions.
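The 16-step sequencing described above can be sketched as follows. This is a minimal illustration under our own assumptions (four tracks, boolean on/off steps), not Sound Machine's implementation.

```cpp
#include <array>
#include <cstddef>

// Hypothetical core of a 16-step sequencer: each track is a row of 16
// on/off steps; on every clock tick the playhead advances through the
// pattern (wrapping after step 16) and reports which tracks should
// trigger their sample on that step.
constexpr std::size_t kSteps = 16;

struct Track {
    std::array<bool, kSteps> steps{};  // true = trigger sample on this step
};

struct StepSequencer {
    std::array<Track, 4> tracks{};  // e.g. kick, snare, hat, banked voice
    std::size_t playhead = 0;

    // Advance one step and return, per track, whether it fires.
    std::array<bool, 4> tick() {
        std::array<bool, 4> fired{};
        for (std::size_t t = 0; t < tracks.size(); ++t) {
            fired[t] = tracks[t].steps[playhead];
        }
        playhead = (playhead + 1) % kSteps;
        return fired;
    }
};
```

Loop-based playback falls out of the wrapping playhead: the pattern repeats until the user edits it, and saved patterns can be chained to form longer compositions.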
Read more about Sound Machine…
Microsoft Expressive Pixels is a platform that enables the authoring and rendering of static and animated visualizations on NeoPixel and other addressable LED displays, for Makers and Professionals alike. The Expressive Pixels suite consists of an authoring tool tailored to producing visuals for LED matrix displays; firmware, with open-source code, tailored to run on ARM Cortex M0 and above embedded Arduino platforms; a reference design and schematic for the Expressive Pixel Emoticon LED display hardware; and an Azure-based web service backend for sharing and distributing visualizations within the interested community.
For this demo, we will use the displays as a visual augmentation to the physical instrument output in Sound Machine. The custom-designed light show will sync with the beat of each percussive instrument, thus creating a visual affordance for each physical instrument.
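One simple way to realize the beat-synced flashing described above is to flood the display with an instrument's assigned color on each hit and decay it toward black on every following frame. The sketch below is illustrative only (the types and the halving decay are our choices, not the Expressive Pixels firmware).

```cpp
#include <array>
#include <cstdint>
#include <cstddef>

// Hypothetical beat-synced display: flash on a trigger, fade each frame.
struct Rgb { std::uint8_t r = 0, g = 0, b = 0; };

constexpr std::size_t kPixels = 64;  // e.g. an 8x8 addressable LED matrix

struct BeatDisplay {
    std::array<Rgb, kPixels> frame{};

    // Fill the whole frame with the instrument's color on a percussive hit.
    void trigger(Rgb color) { frame.fill(color); }

    // Called once per display frame: halve brightness so the flash fades.
    void fade() {
        for (Rgb& px : frame) {
            px.r = static_cast<std::uint8_t>(px.r / 2);
            px.g = static_cast<std::uint8_t>(px.g / 2);
            px.b = static_cast<std::uint8_t>(px.b / 2);
        }
    }
};
```

Giving each percussive instrument its own color and display is what turns the pattern into a per-instrument visual affordance: the audience (and the performer) can see which instrument just fired.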
Read more about Expressive Pixels…
Ann Paradiso (project director and founder), Dwayne Lamb, Paul Batchelor, Gavin Jancke, Jon Campbell, Arturo Toledo, Alejandro Toledo, Shaun Kane, Chuck Needham, Tambie Angel, Chris O’Dowd, Irina Spiridonova, Jamie Rifley, John Romualdez, Jeremy Best, Harish Kulkarni
Special thanks to Rico Malvar, Jay Beavers, Maggie Duffield, Pete Ansell, Shane Williams, Noelle Sophy, Rich Eizenhoefer, Sheridan Jones, John Ransier, Darren Gehring, Henry Honig, Graham Reeve, Hallie Phillips, Jeff Pedersen, Jenny Lay-Flurrie, Steve Gleason, Jackie Gaddis, Austin Edenfield, Anne Eichmeyer, Nathaniel Swenson, Rebecca Danner, Bill Buxton, Dr. Mike Elliott, Team Gleason, ALS Association Evergreen Chapter, the Microsoft Research Redmond Advanced Development Team, and the Microsoft AI&R Enable Team.