We have continually evolved computing to be not only more efficient, but also more accessible: available at more times, in more places, and to more people. We have progressed from batch computing with punch cards, to interactive command-line systems, to mouse-based graphical user interfaces, and more recently to mobile computing. Each of these paradigm shifts has drastically changed the way we use technology for work and life, often in unpredictable and profound ways.
With the latest move to mobile computing, we now carry devices with significant computational power and capabilities on our bodies. However, their small size typically limits the interaction space to diminutive screens, buttons, and jog wheels, which in turn diminishes their usability and functionality. This presents both a challenge and an opportunity: to develop interaction modalities that open the door to novel uses of computing.
Our work addresses these challenges by appropriating both the human body and the surrounding environment as interaction canvases. We achieve this by repurposing sensors from medical contexts and applying signal processing and machine learning techniques to extract gesture and behavioral information from their signals.
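As an illustrative sketch only (not the actual pipeline behind these projects), the snippet below shows one common way such a sensing chain can be structured: band-pass filter a body-worn sensor stream, slide overlapping windows over it, compute simple time-domain features, and train an off-the-shelf classifier. The sampling rate, filter band, features, and classifier choice here are all assumptions made for the example.

```python
# Hypothetical gesture-classification sketch for a body-worn sensor stream.
# Sampling rate, filter band, features, and classifier are illustrative
# assumptions, not the pipeline used in these projects.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 500  # assumed sensor sampling rate (Hz)

def bandpass(signal, low=20.0, high=200.0, fs=FS, order=4):
    """Keep the band where the gesture-related energy is assumed to live."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def features(window):
    """Simple time-domain features: RMS, mean absolute value,
    zero-crossing count, and waveform length."""
    return np.array([
        np.sqrt(np.mean(window ** 2)),
        np.mean(np.abs(window)),
        np.sum(np.diff(np.sign(window)) != 0),
        np.sum(np.abs(np.diff(window))),
    ])

def windows(signal, size=250, step=125):
    """Slide a half-overlapping window over the filtered stream."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

def train(recordings, labels):
    """Turn labelled recordings into feature vectors and fit a classifier."""
    X = [features(w) for rec in recordings for w in windows(bandpass(rec))]
    y = [lab for rec, lab in zip(recordings, labels)
         for _ in windows(bandpass(rec))]
    clf = SVC(kernel="rbf")
    clf.fit(X, y)
    return clf
```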
People
Scott Saponas (Senior Director)
Desney Tan (Vice President and Managing Director, Microsoft Health Futures)
Gabe Cohn (Principal Researcher)
Sidhant Gupta (Director, Product Incubation)
A.J. Brush (Partner Group Program Manager)