
Partnership yields key breakthroughs in VR’s “grand challenge”

February 6, 2017 | By Microsoft blog editor

By Noboru Sean Kuno, Research Program Manager, Microsoft Research Asia

The potential for virtual reality (VR) to upend industrial design, medicine, and other specialized fields has now vaulted the emerging field into the ranks of what the National Academy of Engineering calls its 14 grand challenges of the 21st century, an eclectic list of endeavors from preventing nuclear terror to securing cyberspace.

The importance of improving VR and 3D immersive communication has been a cornerstone of Microsoft's long-term investment in this technology space, resulting in multiple innovations, from Microsoft's Kinect for Xbox 360 sensor, Surface Hub, and HoloLens to the Windows Creators Update.

Dr. Gene Cheung, associate professor, National Institute of Informatics, Japan

Collaborating with partners

Realizing more immersive communication via 3D applications requires a quantum leap in the capture and exchange of 3D geometry that can only be achieved with an ongoing commitment to signal processing research. At the heart of this effort is our collaborative research (CORE) project with academic partners including Dr. Gene Cheung, associate professor at Japan’s National Institute of Informatics, who has been tackling this problem for years.

Breakthrough

Using depth-sensing devices such as the Kinect sensor, the researchers developed an algorithm that improves noise reduction and restores missing detail in images. Crucially, they discovered a method that uses a graph-signal smoothness prior to enhance both natural images (see Figure 1) and depth images.


Figure 1: Example of an original 4-bit image (left) and the image enhanced to 8 bits of bit depth using our approach (right)
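To give a flavor of how a graph-signal smoothness prior works, here is a minimal 1-D sketch. This is an illustration of the general idea only, not the published algorithm: it assumes a simple path graph over pixels and a quadratic Laplacian regularizer, whereas the actual research uses far richer graph constructions tuned to image content.

```python
import numpy as np

def path_graph_laplacian(n):
    """Laplacian L = D - W of a path graph connecting adjacent pixels."""
    L = np.zeros((n, n))
    for i in range(n - 1):
        L[i, i] += 1.0          # degree of node i
        L[i + 1, i + 1] += 1.0  # degree of node i+1
        L[i, i + 1] -= 1.0      # edge (i, i+1)
        L[i + 1, i] -= 1.0
    return L

def restore_with_graph_prior(y, lam=2.0):
    """Restore a noisy signal y by minimizing
       ||x - y||^2 + lam * x^T L x,
    where x^T L x penalizes differences across graph edges
    (the smoothness prior). The closed-form minimizer solves
       (I + lam * L) x = y.
    """
    n = len(y)
    L = path_graph_laplacian(n)
    return np.linalg.solve(np.eye(n) + lam * L, y)

# A flat signal corrupted by a noise spike at index 2:
noisy = np.array([1.0, 0.9, 3.0, 1.1, 1.0])
restored = restore_with_graph_prior(noisy)
```

Because the Laplacian's rows sum to zero, the restoration suppresses the spike while preserving the signal's overall intensity; the same quadratic machinery, with graphs built from pixel similarity, underlies graph-based denoising and bit-depth enhancement.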

Collaboration with Microsoft Research

Dinei Florencio, senior researcher at Microsoft Research, has been working alongside professor Cheung on research into “rate-constrained 3D surface estimation” and “precision enhancement of multiple 3D depth maps.”

“These two research lines are the most active in our recent collaboration,” Florencio said. “As we make the needed progress toward immersive communication, I believe Gene’s research is bringing some fundamental contributions.”

Other key members of the project include Cha Zhang of Microsoft Research as well as Pengfei Wan, a former graduate student at Hong Kong University of Science and Technology.

Moving forward

Florencio and Cheung are now leading research into whether active light sensing can accurately detect informative bio-signals — such as pulse rate, respiratory rate, and temperature changes on a face — to reveal stress and mood or indicate whether subjects are lying. A key question of the research is whether active light sensing can be extended to reveal the same details for shaded or remote human subjects.

“The project is very interesting in that it tries to estimate bio-signals for more efficient face-to-face communications,” said Tao Mei, senior researcher at Microsoft Research. “The Principal Investigator (PI) proposed to use active imaging, which is entirely non-contact and noninvasive, to solve this problem with a novel idea by analyzing the constructed thermal and depth images in an indoor active image sensing system.”

Upon completion of the project, Professor Cheung will make the research tool publicly available. I look forward to seeing continued progress and achievements from this collaboration, and we hope more researchers will explore this area to expand the frontier of virtual reality technologies and, one day, realize Princess Leia’s holographic messaging.