Always-Available Mobile Interfaces




We have continually evolved computing to not only be more efficient, but also more accessible, more of the time (and place), and to more people. We have progressed from batch computing with punch cards, to interactive command line systems, to mouse-based graphical user interfaces, and more recently to mobile computing. Each of these paradigm shifts has drastically changed the way we use technology for work and life, often in unpredictable and profound ways.

With the latest move to mobile computing, we now carry devices with significant computational power and capabilities on our bodies. However, their small size typically leads to limited interaction space (diminutive screens, buttons, and jog wheels) and consequently diminishes their usability and functionality. This presents a challenge and an opportunity for developing interaction modalities that will open the door for novel uses of computing.

Our work addresses these challenges by appropriating both the human body and the surrounding environment as interaction canvases. We achieve this by leveraging sensors used in medical contexts, and by applying signal processing and machine learning techniques that extract data about gesture and human behavior from those sensors.



Workout: Automatic Exercise Analysis

Although numerous devices exist to track and share exercise routines based on running and walking, these devices offer limited functionality for strength-training exercises. We introduce a system for automatically tracking repetitive exercises — such as weight training and calisthenics — via an arm-worn inertial sensor, with no user-specific training and no intervention during a workout.
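To make the signal-processing idea concrete, here is a minimal rep-counting sketch over a synthetic accelerometer trace. The sampling rate, window sizes, and threshold are illustrative assumptions, not the published system's parameters:

```python
import numpy as np

def count_reps(accel_mag, fs=50.0, min_period_s=1.0):
    """Count repetitions in an accelerometer-magnitude signal by
    smoothing it and detecting well-separated peaks."""
    # Remove the gravity/DC component.
    detrended = accel_mag - np.mean(accel_mag)
    # Short moving average to suppress sensor jitter.
    smooth = np.convolve(detrended, np.ones(5) / 5, mode="same")
    # Count local maxima that exceed a threshold and are at least
    # min_period_s apart (repetitive exercises are slow and periodic).
    thresh = 0.5 * np.std(smooth)
    min_gap = int(min_period_s * fs)
    count, last = 0, -min_gap
    for i in range(1, len(smooth) - 1):
        if (smooth[i] > thresh and smooth[i] > smooth[i - 1]
                and smooth[i] >= smooth[i + 1] and i - last >= min_gap):
            count, last = count + 1, i
    return count

# Synthetic "workout": 10 slow reps at 0.5 Hz riding on gravity, plus noise.
rng = np.random.default_rng(0)
fs = 50.0
t = np.arange(0, 20, 1 / fs)
signal = 9.8 + 2.0 * np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(len(t))
print(count_reps(signal, fs))  # 10
```

A real pipeline must also segment exercise from non-exercise motion and recognize which exercise is being performed; this sketch covers only the counting step.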


Non-Contact Haptic Feedback Using Air Vortex Rings

We explore the use of air vortex rings to enable at-a-distance haptics. Unlike standard jets of air, which are turbulent and dissipate quickly, vortex rings can be focused to travel several meters and impart perceptible feedback.


Enabling Mobile Phones to Infer Where They Are Kept

We collected data from 693 participants to understand where people keep their phone in different contexts and why. Using this data, we identified three placement personas: Single Place Pat, Consistent Casey, and All-over Alex. We also built prototypes employing capacitive, multispectral, and accelerometer sensing to infer phone placements automatically.


Using the Doppler Effect to Sense Gestures

We present SoundWave, a technique that leverages commodity speakers and microphones to sense in-air gestures. We generate an inaudible tone, which is frequency-shifted when it reflects off moving objects; we measure this shift with the microphone to infer a variety of gestures.
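The core physics is compact enough to sketch. Assuming (hypothetically) an 18 kHz pilot tone and 48 kHz audio sampling, a surface moving toward the device at velocity v shifts the reflected tone by Δf = 2v·f0/c, and the shift can be read off an FFT of the microphone signal:

```python
import numpy as np

C = 343.0      # speed of sound in air, m/s
F0 = 18_000.0  # inaudible pilot tone (Hz) -- illustrative value
FS = 48_000.0  # audio sampling rate (Hz)

def doppler_shift(velocity):
    """Frequency shift of a tone reflected off a surface moving toward
    the speaker/microphone at `velocity` m/s (two-way reflection)."""
    return 2 * velocity / C * F0

def estimate_velocity(samples):
    """Locate the echo peak in the spectrum (excluding the pilot itself)
    and convert its offset from F0 back into a velocity."""
    n = len(samples)
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, 1 / FS)
    # Ignore bins within 20 Hz of the pilot, then take the strongest
    # bin in a band around F0.
    band = (np.abs(freqs - F0) > 20) & (np.abs(freqs - F0) < 500)
    peak = freqs[band][np.argmax(spectrum[band])]
    return (peak - F0) * C / (2 * F0)

# Simulate: pilot tone plus a weaker echo from a hand moving at 0.5 m/s.
t = np.arange(0, 0.5, 1 / FS)            # 0.5 s of audio
echo_f = F0 + doppler_shift(0.5)         # shift of roughly 52 Hz
rx = np.sin(2 * np.pi * F0 * t) + 0.3 * np.sin(2 * np.pi * echo_f * t)
print(round(estimate_velocity(rx), 2))   # 0.5
```

Real reflections are far weaker and smeared across nearby bins, so the actual detection problem is harder than this idealized two-tone case; the sketch only illustrates the frequency-shift principle.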


Sensing Gestures Using the Body as an Antenna

Home environments frequently offer a signal that is unique to locations and objects within the home: electromagnetic noise. In this work, we use the body as a receiving antenna and leverage this noise for gestural interaction.



PocketTouch: Through-Fabric Capacitive Touch Input

PocketTouch uses capacitive sensing to detect finger strokes through fabric (e.g., while your phone is still in your pocket).


Muscle-Computer Interfaces

Muscle-computer interfaces directly sense and decode human muscular activity rather than relying on physical actuation or perceptible user actions. We believe this is a first step toward tapping the vast amount of information contained within human physiology.


Skinput: Bioacoustic Sensing for Input

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body.
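To illustrate the sense-then-classify pipeline in miniature (the actual prototype used an armband of vibration sensors and a more capable classifier over richer features), here is a toy nearest-centroid tap locator. All waveforms, frequencies, and labels below are synthetic:

```python
import numpy as np

def band_energies(vibration, n_bands=8):
    """Toy feature vector: normalized energy in n_bands slices of the
    magnitude spectrum of one tap's vibration waveform."""
    spec = np.abs(np.fft.rfft(vibration))
    e = np.array([np.sum(b ** 2) for b in np.array_split(spec, n_bands)])
    return e / e.sum()  # normalize so overall tap strength cancels out

class NearestCentroidTap:
    """Nearest-mean classifier over tap features."""
    def fit(self, taps, labels):
        self.centroids = {}
        for c in set(labels):
            feats = [band_energies(t) for t, l in zip(taps, labels) if l == c]
            self.centroids[c] = np.mean(feats, axis=0)
        return self

    def predict(self, tap):
        f = band_energies(tap)
        return min(self.centroids,
                   key=lambda c: np.linalg.norm(f - self.centroids[c]))

# Synthetic taps: pretend each body location rings at a characteristic
# frequency after being struck.
rng = np.random.default_rng(0)
t = np.arange(256) / 4000.0
def fake_tap(freq):
    return (np.exp(-40 * t) * np.sin(2 * np.pi * freq * t)
            + 0.05 * rng.standard_normal(256))

train = [fake_tap(f) for f in (300, 300, 700, 700)]
model = NearestCentroidTap().fit(train, ["wrist", "wrist", "elbow", "elbow"])
print(model.predict(fake_tap(700)))  # elbow
```

Normalizing the band energies makes the features insensitive to how hard the user taps, leaving only the spectral shape, which is what varies with tap location.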




In the News

Control your laptop with a wave of your hand

CNN Money, 7 August 2012

Next Up in Kinect-Style Motion Sensing: Ultrasound?

Popular Mechanics, 25 May 2012

Microsoft Turns Jazz Hands Into Gesture Commands Using Sound Waves

IT World, 10 May 2012

Microsoft Research Projects Offer New Takes on Gesture Sensing

IDG News Service, 9 May 2012

Beyond Kinect: Gestural Computer Spells Keyboard Death

New Scientist, 9 May 2012

Gesture Sensing Alternatives Use Radio Interference, Doppler Effect

PC World, 9 May 2012

Laptop Uses Sound for Gesture Control

Discovery News, 9 May 2012

Cool Microsoft Research Takes Kinect To Another Level

PC Magazine, 7 May 2012

Gesture Control System Uses Sound Alone

Technology Review, 7 May 2012

Here’s Looking at You (but I’m Still Texting)

New York Times, 11 Feb 2012

Stealth Texting

Technology Review, 1 Jan 2012

10 Tech Research Projects to Watch (featuring PocketTouch)

PC World, 3 Jan 2012


How to Make a Human Antenna (also at ABC News and MSNBC)

Discovery News, 12 May 2011

Turn your entire home into a game controller

New Scientist, 10 May 2011

Talking to the Wall

Technology Review, 3 May 2011


Microsoft’s Skinput turns hands, arms into buttons

CNN, 19 April 2010

Skinput Makes the Entire Body a Touch Interface

PC World, 13 April 2010

Sensors turn skin into gadget control pad

BBC News, 26 March 2010

Body acoustics can turn your arm into a touchscreen

New Scientist, 1 March 2010

‘Skinput’ Turns Your Body Into Touchscreen Interface

TechNews, 3 March 2010

Skinput Turns Any Bodily Surface Into a Touch Interface

Popular Science, 3 March 2010

Skinput Turns Your Arm into a Touch-Screen

Wired, 3 March 2010


Muscle-Based PC Interface Lets You Literally Point and Click, No Mouse Required

Popular Science, 29 October 2009

Muscle-Bound Computer Interface

MIT Technology Review, 28 October 2009

The Quest for a Better Keyboard

Forbes Magazine, September 2009


High-tech Armband Puts your Fingers in Control

New Scientist, 24 April 2008