VisualPanel

Established: November 15, 2000

Transforming an ordinary piece of paper into a wireless, mobile input device.

Virtual mouse, keyboard and 3D controller with an ordinary piece of paper.

Abstract

In many intelligent environments, instead of using conventional mice, keyboards and joysticks, people are looking for an intuitive, immersive and cost-efficient interaction device. We are developing a vision-based gesture interface prototype system, VisualPanel, which employs an arbitrary quadrangle-shaped panel (e.g., an ordinary piece of paper) and a tip pointer (e.g., a fingertip) as an intuitive, wireless and mobile input device. The system accurately and reliably tracks both the panel and the tip pointer. The panel tracking continuously determines the projective mapping between the panel at its current position and the display, which in turn maps the tip position to the corresponding position on the display. By detecting clicking and dragging actions, the system can fulfill many tasks, such as controlling a large remote display and simulating a physical keyboard. Users can naturally use their fingers or other tip pointers to issue commands and type text. Furthermore, by tracking the 3D position and orientation of the visual panel, the system can also provide 3D information, serving as a virtual joystick to control 3D virtual objects. The system, which runs at around 22 Hz on a PIII 800 MHz PC, is scalable and extensible. Further potential applications include multi-person interaction.
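The core of the panel-to-display mapping described above is a planar projective transformation (homography) from the tracked panel corners to the display corners. The sketch below illustrates that step only, assuming the four panel corners and the tip position have already been tracked in the camera image; the corner ordering, coordinate values and display size are illustrative, not taken from the actual VisualPanel implementation.

import numpy as np

def homography_from_corners(panel_corners, display_corners):
    """Estimate the 3x3 projective mapping that sends the four tracked
    panel corners (image coordinates) to the four display corners."""
    A = []
    for (x, y), (u, v) in zip(panel_corners, display_corners):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    # The homography is the null vector of A (direct linear transform).
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def map_tip_to_display(H, tip_xy):
    """Map the tracked tip position from image coordinates to display
    coordinates using the current homography."""
    p = H @ np.array([tip_xy[0], tip_xy[1], 1.0])
    return p[0] / p[2], p[1] / p[2]

if __name__ == "__main__":
    # Hypothetical tracked panel corners in the camera image,
    # ordered top-left, top-right, bottom-right, bottom-left.
    panel = [(120, 80), (430, 95), (445, 330), (105, 310)]
    # Corresponding corners of a 1280x1024 display.
    display = [(0, 0), (1280, 0), (1280, 1024), (0, 1024)]
    H = homography_from_corners(panel, display)
    print(map_tip_to_display(H, (275, 200)))  # cursor position on the display

Because the homography is re-estimated on every frame as the panel moves, the tip position keeps mapping to the correct display location even when the paper is tilted or translated.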

People

Publications


Videos
