Pre-Touch Sensing for Mobile Interaction

Ken Hinckley, Seongkook Heo, Michel Pahud, Christian Holz, Hrvoje Benko, Abigail Sellen, Richard Banks, Kenton O'Hara, Gavin Smyth, Bill Buxton

CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems

Published by ACM

View Publication

Touchscreens continue to advance—including progress towards sensing fingers proximal to the display. We explore this emerging pre-touch modality via a self-capacitance touchscreen that can sense multiple fingers above a mobile device, as well as grip around the screen’s edges. This capability opens up many possibilities for mobile interaction. For example, using pre-touch in an anticipatory role affords an “ad-lib interface” that fades in a different UI—appropriate to the context—as the user approaches one-handed with a thumb, two-handed with an index finger, or even with a pinch or two thumbs. Or we can interpret pre-touch in a retroactive manner that leverages the approach trajectory to discern whether the user made contact with a ballistic vs. a finely-targeted motion. Pre-touch also enables hybrid touch + hover gestures, such as selecting an icon with the thumb while bringing a second finger into range to invoke a context menu at a convenient location. Collectively these techniques illustrate how pre-touch sensing offers an intriguing new back-channel for mobile interaction.
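The abstract above does not spell out an algorithm, but the anticipatory and retroactive roles it describes can be illustrated with a small heuristic sketch. The snippet below assumes a hypothetical sensor API that reports hovering fingers (HoverPoint) and edge-grip contacts (GripContact); the class names, posture labels, and the 40 mm/s threshold are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HoverPoint:
    """A finger sensed above the screen (hypothetical sensor output)."""
    x: float       # screen coordinates in mm
    y: float
    height: float  # estimated distance above the display in mm

@dataclass
class GripContact:
    """A contact sensed around the screen's edge (hypothetical)."""
    edge: str      # 'left', 'right', 'top', or 'bottom'
    y: float       # position along that edge

def classify_posture(hover: List[HoverPoint], grip: List[GripContact]) -> str:
    """Guess how the user is approaching, so an 'ad-lib' UI can fade in.

    Heuristic sketch only: two hovering fingers suggest a pinch or
    two-thumb posture; grip on both side edges with one hovering finger
    suggests two-handed index-finger use; grip on a single side edge
    with one hovering finger suggests a one-handed thumb.
    """
    side_edges = {c.edge for c in grip if c.edge in ("left", "right")}
    if len(hover) >= 2:
        return "pinch-or-two-thumbs"
    if len(side_edges) == 2 and len(hover) == 1:
        return "two-handed-index"
    if len(side_edges) == 1 and len(hover) == 1:
        return "one-handed-thumb"
    return "unknown"

def classify_approach(trajectory: List[HoverPoint], dt: float) -> str:
    """Retroactively label a completed touch as ballistic vs. finely targeted.

    Heuristic sketch: a high average approach speed reads as a ballistic
    motion; a slow approach reads as fine targeting. The threshold is
    illustrative, not taken from the paper.
    """
    if len(trajectory) < 2 or dt <= 0:
        return "unknown"
    p0, p1 = trajectory[0], trajectory[-1]
    elapsed = dt * (len(trajectory) - 1)
    speed = ((p1.x - p0.x) ** 2 + (p1.y - p0.y) ** 2 +
             (p1.height - p0.height) ** 2) ** 0.5 / elapsed
    return "ballistic" if speed > 40.0 else "finely-targeted"
```

In this sketch, a UI layer would call classify_posture on each sensor frame and cross-fade between, say, a thumb-reachable layout and a full-width layout as the classification changes, while classify_approach would run once a touch lands to decide how forgiving the target acquisition should be.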


New research uses a mobile phone's ability to sense how you are gripping the device, as well as when and where your fingers are approaching it, to adapt interfaces on the fly. The research is described in the paper "Pre-Touch Sensing for Mobile Interaction."

Pre-Touch Sensing for Mobile Interaction (CHI 2016 Live Presentation)