Code Space: Combining Touch, Devices, and Skeletal Tracking to Support Developer Meetings

Proceedings of ITS 2011 | Published by ACM

We present Code Space, a system that contributes touch + air gesture hybrid interactions to support co-located, small-group developer meetings by democratizing access, control, and sharing of information across multiple personal devices and public displays. Our system uses a combination of a shared multi-touch screen, mobile touch devices, and Microsoft Kinect sensors. We describe cross-device interactions that combine in-air pointing — for social disclosure of commands, targeting, and mode setting — with touch for command execution and precise gestures. In a formative study, professional developers were positive about the interaction design, and most felt that pointing with hands or devices and forming hand postures are socially acceptable. Users also felt that the techniques adequately disclosed who was interacting and that existing social protocols would help to dictate most permissions, but also felt that our lightweight permission feature helped presenters manage incoming content.
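As an illustrative sketch (not code from the paper), the in-air pointing described above can be thought of as casting a ray from tracked skeletal joints onto a display plane. The joint inputs, coordinate frame, and display geometry below are assumptions for the example, not details of Code Space itself:

```python
# Hypothetical sketch: estimate where a user points on a wall display,
# given two skeletal joints (elbow and hand) as a Kinect-style tracker
# might report them, in meters, with the display at plane z = display_z.
def pointing_target(elbow, hand, display_z=2.0):
    """Extend the elbow->hand ray to the display plane and return the
    (x, y) hit point, or None if the user points away from the display."""
    dx, dy, dz = (hand[i] - elbow[i] for i in range(3))
    if dz <= 0:  # ray does not travel toward the display
        return None
    t = (display_z - hand[2]) / dz  # parametric distance along the ray
    if t < 0:
        return None
    return (hand[0] + dx * t, hand[1] + dy * t)

# A forearm pointing straight ahead from (0.1, 1.4, 0.5) hits the
# display directly in front of the hand:
print(pointing_target(elbow=(0.1, 1.4, 0.3), hand=(0.1, 1.4, 0.5)))
# → (0.1, 1.4)
```

In the actual system, a hit point like this would only select a target; per the abstract, command execution is reserved for touch, which keeps in-air gestures socially legible without accidental activation.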

Video of the Code Space system, as presented at the Interactive Tabletops & Surfaces 2011 conference.