Fusion of Optical and Radio Frequency Techniques: Cameras, Projectors and Wireless Tags


June 4, 2007


Ramesh Raskar




Using a combination of techniques in the optical and radio frequency domains, one can significantly improve the functionality of cameras for sensing, projectors for augmentation, and RFIDs for location-sensing services. We have recently developed a technique to capture a light field and improve the depth of field of a camera using heterodyning methods common in radio frequency modulation (http://www.merl.com/people/raskar/Mask/). We have also shown that sensor-enhanced wireless tags can be precisely located 500 times per second by exploiting the epipolar geometry of projectors (http://www.merl.com/people/raskar/LumiNetra/). The talk will explore the implications of this fusion for Computational Photography, Motion Capture, Augmented Reality and Displays.


Ramesh Raskar

Ramesh Raskar is a Senior Research Scientist at MERL. His work spans a range of topics in computer vision and graphics, including computational photography, projective geometry, non-photorealistic rendering and intelligent user interfaces. Current projects include optical heterodyning photography, the flutter shutter camera, composite RFID (RFIG), a multi-flash non-photorealistic camera for depth edge detection, locale-aware mobile projectors, high dynamic range video, image fusion for context enhancement, and quadric transfer methods for multi-projector curved screen displays. Dr. Raskar received the TR100 Award in 2004, Technology Review's list of the 100 Top Young Innovators Under 35 worldwide, and the Global Indus Technovator Award in 2003, instituted at MIT to recognize the top 20 Indian technology innovators worldwide. He holds 25 US patents and received Mitsubishi Electric Invention Awards in 2003, 2004 and 2006. He is a member of the ACM and IEEE.