The Archives

A Collection Of Oldies-But-Goodies


Pico Play Anywhere
Andy Wilson, Steven Bathiche, Light Blue Optics Inc. 2009.
This demonstration captures touch input from light reflected off a laser-illuminated plane and uses it to interact with a pico-projected display on the same surface.


Eye to Eye telepresence using a time multiplexed switching diffuser screen.
Dan Rosenfeld, Bill Buxton. 2008.
This demonstration uses the Microsoft Applied Sciences' wedge technology to display an image on a diffuse screen for the user to look at. It rapidly alternates this with a clear state of the screen so that a camera can capture an image of the user through the screen (i.e., looking at the camera). This enables an eye-to-eye teleconferencing scenario when two such devices are connected to each other.

Foldable Displays (tracked with the Wiimote)
Lee, J., Hudson, S. 2008
This is a concept demo of what would be possible once a high-speed hybrid infrared and visible light projector were available. DLP technology has the potential to perform high-speed tracking simultaneously with image projection. With this type of location tracking, we could track points on non-rigid geometries and project accurately onto flexible and foldable surfaces as well as obtain stylus input. This achieves a vision commonly found in science-fiction films where an individual can summon a large display from a pocket-sized device.

Tracking fingers with the Wii Remote
Johnny Lee. 2008.
Using an LED array and some reflective tape, you can use the infrared camera in the Wii remote to track objects, like your fingers, in 2D space. This lets you interact with your computer simply by waving your hands in the air, similar to the interaction seen in the movie "Minority Report". The Wiimote can track up to 4 points simultaneously. The multipoint grid software is a custom C# DirectX program.
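The core of this kind of demo is simply scaling the IR camera's blob reports into screen space. A minimal Python sketch, assuming a hypothetical point list in place of whatever a Wiimote library (such as the original demo's C# stack) actually provides:

```python
# Sketch: mapping Wiimote IR camera points to screen-space cursor positions.
# The Wiimote's IR camera reports up to 4 blob positions on a 1024x768 grid;
# the input list and screen size here are illustrative stand-ins.

CAM_W, CAM_H = 1024, 768        # Wiimote IR camera resolution

def points_to_cursors(ir_points, screen_w, screen_h):
    """Scale raw IR blob coordinates to screen coordinates.

    ir_points: list of (x, y) tuples in camera space.
    The camera faces the user, so x is mirrored.
    """
    cursors = []
    for (x, y) in ir_points[:4]:            # Wiimote tracks at most 4 blobs
        sx = (1.0 - x / CAM_W) * screen_w   # mirror x for a front-facing camera
        sy = (y / CAM_H) * screen_h
        cursors.append((sx, sy))
    return cursors

print(points_to_cursors([(512, 384)], 1920, 1080))  # centre of view -> centre of screen
```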

Low-Cost Multi-touch Whiteboard using the Wiimote
Johnny Lee. 2008.
Since the Wiimote can track sources of infrared (IR) light, you can track pens that have an IR LED in the tip. By pointing a Wiimote at a projection screen or LCD display, you can create very low-cost interactive whiteboards or tablet displays. Since the Wiimote can track up to 4 points, up to 4 pens can be used. It also works great with rear-projected displays.
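The calibration step behind such a whiteboard, mapping the camera's view of the screen onto screen coordinates from four touched corner points, is a standard four-point homography. A pure-Python sketch of that idea (illustrative only, not the demo's actual C# code):

```python
# Sketch: four-point homography calibration, mapping a tracked IR-pen
# position from camera space to screen space.

def gauss_solve(a, b):
    """Solve the linear system a*x = b by Gaussian elimination with pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def calibrate(cam_pts, scr_pts):
    """Fit the homography h (with h[8] fixed to 1) taking four camera-space
    calibration points onto the corresponding screen-space points."""
    a, b = [], []
    for (x, y), (u, v) in zip(cam_pts, scr_pts):
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    return gauss_solve(a, b)

def to_screen(h, x, y):
    """Apply the fitted homography to a camera-space point."""
    d = h[6] * x + h[7] * y + 1.0
    return ((h[0] * x + h[1] * y + h[2]) / d,
            (h[3] * x + h[4] * y + h[5]) / d)

# Calibrate a 1024x768 camera view against a 1920x1080 screen, then map a point:
h = calibrate([(0, 0), (1024, 0), (1024, 768), (0, 768)],
              [(0, 0), (1920, 0), (1920, 1080), (0, 1080)])
print(to_screen(h, 512, 384))
```

In the real demo the four camera points come from touching the pen to the screen corners during calibration, which also corrects for perspective distortion when the Wiimote views the screen at an angle.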

Head Tracking for Desktop VR Displays using the Wii Remote
Johnny Lee. 2008.
Using the infrared camera in the Wii remote and a head mounted sensor bar (two IR LEDs), you can accurately track the location of your head and render view dependent images on the screen. This effectively transforms your display into a portal to a virtual environment. The display properly reacts to head and body movement as if it were a real window creating a realistic illusion of depth and space.
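The head position can be recovered from the two LED blobs with a simple pinhole-camera model: the apparent separation of the LEDs gives depth, and the midpoint gives lateral offset. A sketch under assumed numbers (the 45° field of view is an approximate Wiimote figure, and the 0.15 m LED spacing is a stand-in for the actual sensor bar):

```python
import math

# Sketch: recovering head position from a two-LED head-mounted sensor bar
# seen by the Wiimote's IR camera, via a pinhole-camera model.
# FOV, resolution, and LED spacing are assumptions, not measured values.

CAM_W, CAM_H = 1024, 768
FOV_X = math.radians(45)                       # approx. horizontal field of view
F_PX = (CAM_W / 2) / math.tan(FOV_X / 2)       # focal length in pixels
LED_SPACING = 0.15                             # metres between the two LEDs

def head_position(p1, p2):
    """Estimate head position (x, y, z) in metres from two IR blob centres."""
    sep_px = math.hypot(p1[0] - p2[0], p1[1] - p2[1])  # apparent LED separation
    z = F_PX * LED_SPACING / sep_px                    # pinhole: z = f * D / d
    # Midpoint of the LED pair, re-centred on the optical axis:
    mx = (p1[0] + p2[0]) / 2 - CAM_W / 2
    my = (p1[1] + p2[1]) / 2 - CAM_H / 2
    return (mx * z / F_PX, my * z / F_PX, z)

x, y, z = head_position((462, 384), (562, 384))
print(round(z, 3))   # LEDs 100 px apart -> z = F_PX * 0.15 / 100
```

The renderer then uses (x, y, z) to build an off-axis (asymmetric) projection frustum, which is what makes the display behave like a window.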


Surface based Optical Finger Tracker
John Lutian. 2007.
A very high resolution optical finger tracker for controlling the mouse cursor in tight spaces (where a normal mouse can't be moved).

Legend Morphing using front projection on to physical input devices
Steven Bathiche, Andy Wilson. 2007.
This demonstrates a keyboard whose legend adapts based on the context of the application and user need. This combines a tactile feel with the adaptability of virtual projected images and improves on earlier demonstrations with a full-color, high-resolution, fully dynamic display.


Orb: 3D control using inertial navigation
Steven Bathiche, Andy Wilson. 2005.
This demonstrates a 3D input device which allows you to rotate a virtual object in roll, pitch, and yaw as if you were actually holding it in real life. And, with no wires!

Spotlight: Projector + inertial sensing
Steven Bathiche, Andy Wilson. 2005
This demonstration uses the 3D inertial input from the orb demonstration to control a small projector, in effect creating a virtual flashlight on a navigation application.

Magnetic Tracking for pointer input
Craig Ranta, Steven Bathiche. 2005.
This demonstration uses a Honeywell magnetometer to track a magnet in 2D space. The magnet location information is used as a high resolution cursor controller.
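The physics that makes this workable is the sharp inverse-cube falloff of a dipole field. A toy Python model of the idea (not the demo's actual algorithm; it assumes the Earth's field has been subtracted and treats the magnet as an on-axis dipole with known moment):

```python
import math

# Sketch: locating a magnet in 2D from a magnetometer reading, using the
# inverse-cube falloff of a dipole field. Toy model for illustration only.

MU0 = 4 * math.pi * 1e-7       # vacuum permeability (T*m/A)
MOMENT = 1.0                   # magnet's dipole moment (A*m^2), assumed known

def locate_magnet(bx, by):
    """Return (distance_m, bearing_rad) of the magnet from the sensor.

    bx, by: field components in tesla with the background field removed.
    On-axis dipole magnitude: B = MU0 * 2 * MOMENT / (4 * pi * r**3),
    so r = (MU0 * MOMENT / (2 * pi * B)) ** (1/3).
    """
    b = math.hypot(bx, by)
    r = (MU0 * MOMENT / (2 * math.pi * b)) ** (1.0 / 3.0)
    return r, math.atan2(by, bx)

# A magnet 10 cm away on the sensor's x axis produces this on-axis field:
b_at_10cm = MU0 * 2 * MOMENT / (4 * math.pi * 0.1 ** 3)
r, theta = locate_magnet(b_at_10cm, 0.0)
print(round(r, 3), round(theta, 3))   # recovers r = 0.1 m, bearing 0
```

The steep 1/r³ gradient is what gives such a tracker its high resolution at close range.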

Optical Finger Tracker: sensing fingers from edge of Keyboard
Randy Kong. 2005.
This demonstration uses edge illumination slightly above the desk surface, as could be embedded in the side of a keyboard, to track fingers for touch input.

Warp Pointer: a large screen input device
Andy Wilson, Mike Sinclair, Gary Starkweather, Steven Bathiche. 2005.
This demonstration looks into cursor control on very large displays. By using a camera to track a wand with embedded inertial sensors, it enables both control of absolute position over the entire display and precise relative control within one region of the display.

Teddy: A personal Computer based Kids Companion
Steven Bathiche, Kris Nye, Andy Wilson, John Hershey. 2005
This demonstration uses sight and sound to give the computer the ability to know who is currently of interest (and to whom priority attention should be given). More generally, it explored interacting with a computer without the use of a keyboard, mouse, or display. It uses an array microphone to locate sounds and face tracking to look for people.

Moveable Projected Displays using Projector Based Tracking
Lee, J., Hudson, S., Summet, J., and Dietz, P. 2005.
By projecting smaller patterns over the discovered locations of the sensors, we can obtain location updates sufficiently fast to do interactive tracking of hand-held surfaces and objects. Additionally, small patterns free up pixels that can be used for application content. This work also describes a technique for projecting the patterns in a frequency modulated (FM) manner such that they are imperceptible to the human eye. The result is that the patterns appear as solid gray squares, eliminating the high-contrast black-and-white patterns used previously.


Internet Controlled Telepresence Robot
Steven Bathiche. 2004
This demonstration investigated using robotics to enable spontaneous video conference sessions by allowing the camera and display to be controlled and moved remotely.

Transparent Screen: Projecting onto a Laminar flow of water
Steven Bathiche. 2004
This demonstrates a transparent rear-projection screen made out of water vapor.

Transparent Puzzle Pieces on Surface
Steven Bathiche, Andy Wilson, David Kurlander. 2004
This demonstrates an interactive video puzzle on Microsoft Surface. By using IR tags, the Surface can locate the position, rotation, and inversion of the transparent puzzle pieces and display the appropriate video content, enabling a game that mixes real world objects with virtual content.

Haptic Pen
Lee, J., Dietz, P., Leigh, D., Yerazunis, W., and Hudson, S., 2004
A device for providing tactile feedback in a stylus that is able to simulate the sensations of pressing a physical button.

Automatic Projector Calibration with Embedded Light Sensors
Lee, J., Dietz, P., Aminzade, D., and Hudson, S. 2004
This video demonstrates a target screen fitting application. It goes on to demonstrate how this approach can be used in multi-projector applications such as stitching (creating a large display using tiled projection) or layering (multiple versions of content on the same area for view dependent displays). Additionally, it can be used to automatically register the orientation of 3D surfaces for augmenting the appearance of physical objects.
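The standard way such a light sensor discovers its own projector-pixel coordinate is Gray-coded structured light: the projector shows a sequence of bit-plane patterns, and the sensor's sequence of light/dark readings spells out its column (and, with a second pass, its row). A sketch of the decoding, following the textbook scheme rather than the demo's specific hardware:

```python
# Sketch: decoding Gray-code structured-light patterns at an embedded sensor.

def gray_encode(n):
    """Binary-reflected Gray code of integer n."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Invert the Gray code back to a plain integer."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def sensor_bits(x, n_patterns):
    """Light/dark readings a sensor at projector column x would report
    as each of the n_patterns Gray-code bit-plane patterns is shown."""
    g = gray_encode(x)
    return [(g >> (n_patterns - 1 - i)) & 1 for i in range(n_patterns)]

def decode_column(bits):
    """Recover the projector column from a sensor's reading sequence."""
    g = 0
    for b in bits:
        g = (g << 1) | b
    return gray_decode(g)

# 10 patterns resolve 2**10 = 1024 projector columns:
readings = sensor_bits(637, 10)
print(decode_column(readings))   # -> 637
```

Gray codes are preferred over plain binary because adjacent columns differ in only one bit, so a sensor sitting on a pattern boundary can be off by at most one pixel.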


First prototype of Microsoft Surface
Steven Bathiche, Andy Wilson, David Kurlander. 2003
This demonstrates some of the mixed reality applications of the first prototype of Microsoft Surface. In one scene, we see a painting application that uses ordinary paint brushes to apply virtual paint. Later, we see a puzzle application that uses physical pieces to move and orient video content on the screen. Finally, we see a tank game that allows the player to move physical toys that the Surface sees and incorporates into the game. In addition to understanding the board state (walls and tanks), the Surface also interacts in the real world via the movement of the computer controlled (physical) tank.


iBlocks: Tangible and virtual building blocks with real time digitization.
Steven Bathiche. 2002.
This demonstrates real-time computer aided modeling that mixes real world objects with virtual interactivity.

Hover Surface Tracking using Laser Speckle
Steven Bathiche. 2002.
This demonstration tracks laser light scattered off a surface (i.e., laser speckle), enabling a robust optical surface-tracking engine with hover capability.
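Speckle tracking works like an optical mouse sensor: successive frames of the speckle pattern are correlated to find how far the pattern (and hence the surface) has shifted. A minimal block-matching sketch on synthetic data, illustrating the correlation idea rather than the demo's actual implementation:

```python
# Sketch: estimating surface motion by block-matching successive speckle
# frames. Frame data here is synthetic; a real sensor would supply small
# grayscale images of the laser speckle pattern.

def best_shift(prev, curr, max_shift=2):
    """Find the (dx, dy) shift that best aligns curr with prev by
    minimizing mean squared difference over the overlapping region."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        err += (prev[y][x] - curr[sy][sx]) ** 2
                        n += 1
            err /= n
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best

# A synthetic "speckle" frame and a copy shifted one pixel to the right:
frame0 = [[(x * 7 + y * 13) % 17 for x in range(8)] for y in range(8)]
frame1 = [[frame0[y][x - 1] if x > 0 else 0 for x in range(8)] for y in range(8)]
print(best_shift(frame0, frame1))   # -> (1, 0): surface moved one pixel
```

Because speckle arises from interference at the surface itself, the pattern stays trackable even when the sensor hovers above the surface, which is what ordinary imaging mice lose.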


Legend Morphing Keys using electroluminescent stacks
Steven Bathiche. 2000
This demonstration uses low-cost transparent, pre-printed electroluminescent tags to create key legends that adapt, and keyboard functionality that changes, based on application context.