Microsoft Research Blog


Preview of ACM’s Multimedia Conference Oct 29 keynote address

October 26, 2015 | Posted by Microsoft Research Blog


Zhengyou Zhang, research manager and principal researcher at Microsoft Research, will present his team’s latest advances in immersive human-human telecommunications at ACM’s annual multimedia conference in Brisbane, Australia. The 2015 ACM Multimedia Conference runs October 26–30. View the full conference program.

In an October 29 keynote address titled “Vision-enhanced Immersive Interaction and Remote Collaboration with Large Touch Displays,” Dr. Zhang will demonstrate how the Kinect-inspired technology enables remote teams to feel as if they were working together in the same room.

The importance and impact of such immersive experiences first came to prominence in 2012 with Dr. Zhang’s paper, Microsoft Kinect Sensor and Its Effect, published in the journal IEEE MultiMedia. It has since become one of the publication’s most downloaded papers and earned Dr. Zhang the journal’s 2015 IEEE MultiMedia Best Department Article Award, adding to his extensive list of honors.

Dr. Zhang, who leads the Multimedia, Interaction, and Experience (MIX) group at Microsoft Research, will give ACM conference attendees a close-up view of ViiBoard (Vision-enhanced Immersive Interaction with touch Board). The system—consisting of VTouch and ImmerseBoard features—enables “natural interaction and immersive remote collaboration with large touch displays by adding a commodity color plus depth sensor,” according to ACM conference notes.

  • VTouch uses an RGBD sensor such as Microsoft Kinect to understand where the user is, who the user is, and what the user is doing even before the user touches the display.
  • ImmerseBoard uses 3D processing of depth images, life-sized rendering, and novel visualizations to emulate writing side-by-side on either a physical whiteboard or mirror.

The net effect provides remote participants with “a quantitatively better ability to estimate their remote partners’ eye gaze direction, gesture direction, intention, and level of agreement.”

To date, only brief details of ViiBoard have been released, most notably in online videos and two recent conference papers.

ImmerseBoard’s form factor is described in conference notes as “suitable for practical and easy installation in homes and offices.” Public availability has yet to be announced.

—John Kaiser, Research News
