Optically Sensing Tongue Gestures for Computer Input
- Scott Saponas | Microsoft Research
Many patients with paralyzing injuries or medical conditions retain the use of their cranial nerves, which control the eyes, jaw, and tongue. While researchers have explored eye-tracking and speech technologies for these patients, we believe there is potential in directly sensing explicit tongue movement for controlling computers. In this talk, we describe a novel approach: using infrared optical sensors embedded within a dental retainer to sense tongue gestures. We describe an experiment in which our system discriminated among four simple gestures with over 90% accuracy. In this experiment, users were also able to play the game Tetris with their tongues. Finally, we present lessons learned and opportunities for future work.
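To make the sensing-and-classification idea concrete, below is a minimal illustrative sketch in Python of how windows of infrared proximity readings could be turned into gesture predictions. This is not the prototype described in the talk: the sensor count, sampling rate, window length, features, gesture labels, and the SVM classifier are all assumptions made for the example, and the data here is synthetic.

```python
# Illustrative sketch only (not the actual retainer system): classify four
# tongue gestures from short windows of infrared proximity-sensor readings.
# Assumptions: 4 IR sensors sampled at 100 Hz, 0.5 s windows, an RBF-SVM
# classifier, and a hypothetical gesture set.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

N_SENSORS = 4                                # assumed sensor count
WINDOW = 50                                  # assumed 0.5 s window at 100 Hz
GESTURES = ["left", "right", "up", "down"]   # hypothetical gesture labels

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one (WINDOW x N_SENSORS) window as per-sensor statistics."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def synthetic_window(gesture_id: int) -> np.ndarray:
    """Fake data for the demo: each gesture brightens a different sensor."""
    w = np.random.rand(WINDOW, N_SENSORS) * 0.1
    w[:, gesture_id] += np.linspace(0.0, 1.0, WINDOW)  # reflection ramps up
    return w

# Build a toy dataset, train a classifier, and report held-out accuracy.
X, y = [], []
for label in range(len(GESTURES)):
    for _ in range(100):
        X.append(extract_features(synthetic_window(label)))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In a live system, the same window-and-classify loop would run continuously on the sensor stream, mapping each recognized gesture to an input event (for example, a Tetris move).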
Speaker Details
Dr. T. Scott Saponas is a researcher in the Computational User Experiences group within the Visualization and Interaction Area at Microsoft Research in Redmond, WA. He earned a Ph.D. in Computer Science and Engineering from the University of Washington in 2010 and an M.S. in 2006, advised by Professor James Landay and Dr. Desney Tan. He was honored as one of Technology Review's 2010 Young Innovators Under 35. In 2008, he was awarded a Microsoft Research Graduate Fellowship. His research interests include Human-Computer Interaction (HCI) and Physiological Computing. Scott received a B.S. in Computer Science from the Georgia Institute of Technology in 2004, advised by Professor Gregory Abowd.