I am a senior researcher in the Natural Interaction Research group at Microsoft Research. My research is in the field of Human-Computer Interaction. My interests include:

  • augmented and virtual reality,
  • haptics,
  • interactive projection mapping,
  • new input form factors and devices,
  • touch and freehand gestural input.

I am passionate about creating highly interactive real-time demonstration systems that highlight the core innovations while at the same time surprising and delighting the user. In my work, I try to strike a balance between long horizon research and solutions that are directly applicable to products in the near term.

More information can be found on my external webpage.


Sparse Haptic Proxy

Established: May 4, 2017

  A virtual reality user may find themselves in many different virtual worlds, such as a spy game (middle) or a space simulator (right), yet all of them can provide tangible feedback using the same physical geometry in the real world (left). We propose a class of passive haptics that we call Sparse Haptic Proxy: a set of geometric primitives that simulate touch feedback in elaborate virtual reality scenes. Unlike previous passive haptics that replicate the…

Enhancing Input On and Above the Interactive Surface with Muscle Sensing

Current interactive surfaces provide little or no information about which fingers are touching the surface, the amount of pressure exerted, or gestures that occur when not in contact with the surface. These limitations constrain the interaction vocabulary available to interactive surface systems. In our work, we extend the surface interaction space by using muscle sensing to provide complementary information about finger movement and posture. In this paper, we describe a novel system that combines muscle…

Muscle-Computer Interfaces (muCIs)

Established: November 28, 2016

Many human-computer interaction technologies are currently mediated by physical transducers such as mice, keyboards, pens, dials, and touch-sensitive surfaces. While these transducers have enabled powerful interaction paradigms and leverage our human expertise in interacting with physical objects, they tether computation to a physical artifact that has to be within reach of the user. As computing and displays begin to integrate more seamlessly into our environment and are used in situations where the user is not…

Room2Room: Life-Size Telepresence in a Projected Augmented Reality Environment

Established: March 2, 2016

Room2Room is a life-size telepresence system that leverages projected augmented reality to enable co-present interaction between two remote participants. We enable a face-to-face conversation by performing 3D capture of the local user with color + depth cameras and projecting their virtual copy into the remote space at life-size scale. This creates an illusion of the remote person’s presence in the local space, as well as a shared understanding of verbal and non-verbal cues (e.g., gaze).

Rich Haptic Feedback in Virtual Reality

Established: July 1, 2015

Publications:

  • Hrvoje Benko, Christian Holz, Mike Sinclair, and Eyal Ofek. NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers. In Proc. of ACM UIST 2016.
  • Mahdi Azmandian, Mark Hancock, Hrvoje Benko, Eyal Ofek, and Andrew Wilson. Haptic Retargeting: Dynamic Repurposing of Passive Haptics for Enhanced Virtual Reality Experiences. In Proc. of ACM CHI 2016.

Tablet and Stylus Interaction

Established: October 3, 2014

We explore grip and motion sensing to afford new techniques that leverage how users naturally manipulate tablet and stylus devices during pen-and-touch interaction. We can detect whether the user holds the pen in a writing grip or tucked between their fingers. We can distinguish bare-handed inputs, such as drag and pinch gestures, from touch gestures produced by the hand holding the pen, and we can sense which hand grips the tablet, and determine the screen's…

Dyadic Mano-a-Mano

Established: October 3, 2014

Mano-a-Mano is a unique spatial augmented reality system that combines dynamic projection mapping, multiple perspective views, and device-less interaction to support face-to-face, or dyadic, interaction with 3D virtual objects. Its main advantage over more traditional AR approaches is that users are able to interact with 3D virtual objects and each other without cumbersome devices that obstruct face-to-face interaction. ACM Digital Library download: Dyadic projected spatial augmented reality

RoomAlive Toolkit

Established: October 1, 2014

The RoomAlive Toolkit is an open source SDK that enables developers to calibrate a network of multiple Kinect sensors and video projectors. The toolkit also provides a simple projection mapping sample that can be used as a basis to develop new immersive augmented reality experiences similar to those of the IllumiRoom and RoomAlive research projects. The RoomAlive Toolkit is provided as open source under the MIT License. The code is available for download at GitHub: https://github.com/Kinect/RoomAliveToolkit.…

RoomAlive

RoomAlive is a proof-of-concept prototype that transforms any room into an immersive, augmented, magical entertainment experience. RoomAlive presents a unified, scalable approach for interactive projection mapping that dynamically adapts content to any room. Users can touch, shoot, stomp, dodge and steer projected content that seamlessly co-exists with their existing physical environment.

IllumiRoom: Peripheral Projected Illusions for Interactive Experiences

Established: January 4, 2013

IllumiRoom is a proof-of-concept system from Microsoft Research. It augments the area surrounding a television screen with projected visualizations to enhance the traditional living room entertainment experience. Read the CHI 2013 Best Paper: IllumiRoom: Peripheral Projected Illusions for Interactive Experiences (7.6 MB PDF). IllumiRoom uses a Kinect for Windows camera and a projector to blur the lines between on-screen content and the environment we live in, allowing us to combine our virtual…

LightSpace

Established: October 2, 2010

LightSpace combines elements of surface computing and augmented reality research to create a highly interactive space where any surface, and even the space between surfaces, is fully interactive. Our concept transforms the ideas of surface computing into the new realm of spatial computing. Instrumented with multiple depth cameras and projectors, LightSpace is a small room installation designed to explore a variety of interactions and computational strategies related to interactive displays and the space that they…


Publications
Pen + Touch = New Tools
Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, Bill Buxton, in UIST '10 Proceedings of the 23rd annual ACM symposium on User interface software and technology, ACM, October 3, 2010, View abstract, Download PDF


Mouse 2.0: Multi-touch meets the mouse
Nicolas Villar, Shahram Izadi, Dan Rosenfeld, Hrvoje Benko, John Helmes, Jonathan Westhues, Steve Hodges, Eyal Ofek, Alex Butler, Xiang Cao, Billy Chen, in Proceedings of UIST 2009, ACM Symposium on User Interface Software and Technology. p. 33-42. UIST 2009 Best Paper Award., Association for Computing Machinery, Inc., October 1, 2009, View abstract, Download PDF


Conference Organization

  • Program Committee member for ACM SIGGRAPH 2015
  • Program Committee member for ACM CHI 2015
  • Demo Co-Chair for ACM ICMI 2015
  • General Chair ACM UIST 2014
  • Program Committee member for ACM CHI 2014
  • Program Committee member for ACM ISMAR 2013
  • Program Co-Chair for ACM UIST 2012
  • Program Committee member for ACM ISMAR 2012
  • Program Committee member for ACM CHI 2012
  • Program Committee member for ACM ITS 2011
  • Program Committee member for Video Showcase at ACM CHI 2011
  • Program Committee member for ACM UIST 2010
  • Program Committee member and Doctoral Symposium Co-Chair for ACM ITS 2010
  • Program Committee member for ACM ICMI/MLMI 2010
  • Program Committee member for IEEE 3DUI 2010
  • Webmaster for ACM UIST 2009
  • Program Committee Member for IEEE 3DUI 2009
  • Demo Co-Chair for ACM UIST 2008
  • Program Committee member and Publicity Chair for 2nd International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN ’08), January 8-10, 2008, Cancun, Mexico.
  • Publicity Co-Chair for The Sixth IEEE International Symposium on Mixed and Augmented Reality (ISMAR ’07), November 13-16, 2007, Nara, Japan.
  • Co-Chair for the First Annual Columbia Computer Science Student Research Symposium, December 8th 2006, Davis Auditorium, Columbia University, NY.
  • Student Volunteer Organizing Committee, IEEE and ACM ISAR 2001 (International Symposium on Augmented Reality). New York, NY. October 29-30, 2001.

Editorial Activities

  • Associate Editor of IEEE Computer Graphics and Applications
  • Guest Editor for the Special Issue of IEEE Computer Graphics and Applications on Interacting beyond the Screen. To be published in May/June 2014.
  • Guest Editor for the Special Issue of Computers and Graphics on Touching the 3rd Dimension. Published by Elsevier in Dec. 2012.

Peer Reviewer

  • ACM SIGCHI: 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015
  • ACM SIGGRAPH: 2003, 2008, 2009, 2010, 2011, 2012, 2014, 2015
  • ACM SIGGRAPH ASIA: 2012, 2013
  • ACM User Interface Software and Technology (UIST): 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013
  • ACM Interactive Tabletops and Surfaces (ITS): 2009, 2010, 2011, 2012, 2013
  • ToCHI Journal: 2009, 2012
  • IEEE Transactions on Visualization and Computer Graphics (TVCG): 2008, 2009, 2011
  • International Conference on Multimodal Interfaces (ICMI): 2010
  • IEEE Virtual Reality (VR): 2004, 2005, 2007
  • IEEE Symposium on 3D User Interfaces: 2007, 2008, 2009, 2010, 2011
  • IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR): 2003, 2004, 2005, 2010
  • IEEE International Symposium on Wearable Computing (ISWC): 2007
  • IEEE Visualization: 2003, 2006
  • Pervasive: 2005
  • SmartGraphics: 2004

Supervised Students & Interns

  • Tomislav Pejsa (Summer 2014) coadvised with E. Ofek, A. Wilson
  • Julian Kantor (Summer 2014) coadvised with E. Ofek, A. Wilson
  • Feng Zheng (Summer 2014) coadvised with E. Ofek, A. Wilson
  • Yan Wang (Spring 2014) coadvised with E. Ofek, A. Wilson
  • Jarrod Knibbe (Summer 2013)
  • Ravish Mehra (Summer 2013)
  • Brett Jones (Summer 2013) coadvised with E. Ofek, A. Wilson
  • Rajinder Sodhi (Summer 2013) coadvised with E. Ofek, A. Wilson
  • Michael Murdock (Summer 2013) coadvised with E. Ofek, A. Wilson
  • Felipe Bacim de Araujo e Silva (Summer 2012) coadvised with M. Sinclair
  • Brett Jones (Summer 2012)
  • Rajinder Sodhi (Summer 2011)
  • Chris Harrison (Spring 2011)
  • Roland Aigner (Fall 2010) coadvised with D. Wigdor
  • Ricardo Costa Jota (Summer 2010)
  • David Holman (Summer 2010)
  • Dustin Freeman (Spring 2009) coadvised with D. Wigdor
  • Miguel Nacenta (Summer 2008) coadvised with A. Wilson and P. Baudisch
  • Bjoern Hartmann (Summer 2008) coadvised with M. Morris and A. Wilson
  • Bhashinee Garg (Spring 2007) – Exploration of dual-sided multi-touch interactions on a handheld device
  • Ivor Baksa (Spring 2006) – Development of AR conversational agent
  • Shezan Baig (Spring 2004) – TabletPC AR system for archeological visualization
  • Erik Peterson (Spring 2004) – Interface development for situated AR multimedia
  • Shezan Baig (Fall 2003) – Adaptive meshing algorithm for large meshes
  • Sajid Sadi (Spring 2003) – P5 Glove gesture recognizer for SenseShapes
  • Zachariah Munoz (Fall 2002, Spring 2003) – Modeling of the Cathedral of St. John the Divine
  • Sajid Sadi (Fall 2002) – Control interface for SenseShapes

Student Volunteer

  • SIGCHI 2007
  • SIGMM 2004
  • ISAR 2001

Courses Taught

  • Introduction to Programming in Java – COMS W3101-03 (Spring 2005)

Teaching Assistant

  • Video Games Design and Technology – COMS W4995-01 (Spring 2003)
  • User Interface Design – COMS W4170 (Fall 2001)
  • Introduction to Computing in Java (Fall 1999)