{"id":647676,"date":"2020-04-07T10:18:49","date_gmt":"2020-04-07T17:18:49","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=647676"},"modified":"2020-04-23T15:50:15","modified_gmt":"2020-04-23T22:50:15","slug":"bringing-virtual-reality-to-people-who-are-blind-with-an-immersive-sensory-based-system","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/bringing-virtual-reality-to-people-who-are-blind-with-an-immersive-sensory-based-system\/","title":{"rendered":"Bringing virtual reality to people who are blind with an immersive sensory-based system"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-647694 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1.png\" alt=\"A man uses a VR white cane controller in an empty parking garage. Two small images in the upper right show a rendered overhead view of a room and a virtual white cane pointing at a yellow cube-shaped virtual object. 
\" width=\"1400\" height=\"788\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1.png 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-1024x576.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-768x432.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-655x368.png 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-343x193.png 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-640x360.png 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-960x540.png 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-1-1280x720.png 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/p>\n<p>Virtual reality (VR) is an incredibly exciting way to experience computing, providing users with intuitive and immersive means of interacting with information that attempts to mirror the way we naturally experience the world around us. In the past few years, powerful VR systems have dropped in price and are on the verge of becoming mainstream technologies with potential uses in all kinds of applications. However, most VR technologies focus on rendering realistic visual effects. In fact, the hallmark of most VR systems is the head-mounted display that completely dominates a user\u2019s visual field. 
But what happens if a VR user is blind? Does that mean that they are completely shut out of virtual experiences?<\/p>\n<p>In this project, published in the paper <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/virtual-reality-without-vision-a-haptic-and-auditory-white-cane-to-navigate-complex-virtual-worlds\/\">\u201cVirtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds,\u201d<\/a> we investigated a new controller that mimics the experience of using a white cane to enable a user who is blind to explore large virtual environments in the same way they navigate the real world\u2014by using their senses of touch and hearing. Our paper has been accepted at the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/chi2020.acm.org\/\">ACM CHI Conference on Human Factors in Computing Systems (CHI 2020)<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> and received an <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/chi2020.acm.org\/for-attendees\/chi-2020-best-papers-honourable-mentions\/\">Honourable Mention Award<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n<h3>Making a white cane for the virtual world with users\u2019 needs in mind<\/h3>\n<p>White cane users undergo intensive training to effectively use their canes to navigate and explore the world. They learn to hold the cane differently for different situations, listen to the sounds generated as the cane taps or sweeps along the ground and obstacles, and feel subtle changes in vibrations as the cane encounters different materials. 
By combining this experience with their other senses (sound, smell, and touch), they can use their cane to effectively navigate their environment.<\/p>\n<p>In 2018, we introduced the concept of a haptic white cane controller for VR in a <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/enabling-people-visual-impairments-navigate-virtual-reality-haptic-auditory-cane-simulation-2\/\">paper<\/a> that demonstrated how users who are blind could utilize their skills with a white cane to explore a small virtual space. This year, we expand on this work to make the controller more natural, allowing for immersive navigation of large, complex environments comprising multiple rooms. The controller is mounted to a harness that users wear around their waist, and they can then hold the controller like they would an ordinary white cane. This allows them to use the mobility and orientation skills that they\u2019ve learned for the real world to navigate a virtual world, using the virtual cane to detect walls, doors, obstacles, and changes in surface textures. See Figure 1 below for a detailed breakdown of the controller\u2019s components.<\/p>\n<div id=\"attachment_647679\" style=\"width: 1359px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-647679\" class=\"wp-image-647679 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Figure-1.png\" alt=\"A woman wearing the white cane VR controller. It comprises headphones, a support harness, and the controller with the various sensors, brakes, and actuators. An inset illustrates three different styles of grips supported by the controller. 
\" width=\"1349\" height=\"715\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Figure-1.png 1349w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Figure-1-300x159.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Figure-1-1024x543.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Figure-1-768x407.png 768w\" sizes=\"auto, (max-width: 1349px) 100vw, 1349px\" \/><p id=\"caption-attachment-647679\" class=\"wp-caption-text\">Figure 1: Left: Components of our navigation cane controller. The controller renders force feedback in three orthogonal axes of motion, tactile feedback through a voice coil actuator, and spatialized audio effects through stereo headphones. 6-DOF trackers on the headphones and cane localize the user in virtual space, and the belt fastens our controller to the body. Upper right: People who are blind use different white cane grip styles based on need and preference. Our controller accommodates various styles. A) Traditional cane grip centered-high, B) pencil cane grip centered-low, and C) standard cane grip centered-low.<\/p><\/div>\n<h3>Putting the pieces together: How the controller emulates a real-world environment<\/h3>\n<p>Our controller uses a lightweight, three-axis brake mechanism (controlling dimensions of movement side-to-side, up-down, and forward-backward) to provide users the general shape of virtual objects. Each of the braking mechanisms has a unique construction that enables it to address different needs (see our paper for in-depth explanations of each of these). 
Figure 2 shows how one of these brakes operates using a coiled cord to provide tension, and Figure 3 provides further details of how the brake generates friction using a <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/www.ahdictionary.com\/word\/search.html?q=capstan\">capstan<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. The flexibility of the three-axis system enables people to adapt the controller to different grips, depending on the context of use.<\/p>\n<div id=\"attachment_647682\" style=\"width: 1169px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-647682\" class=\"wp-image-647682 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-white-cane-fig-2.jpg\" alt=\"A coiled braking mechanism, with an arrow showing that it rotates from side to side. \" width=\"1159\" height=\"590\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-white-cane-fig-2.jpg 1159w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-white-cane-fig-2-300x153.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-white-cane-fig-2-1024x521.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-white-cane-fig-2-768x391.jpg 768w\" sizes=\"auto, (max-width: 1159px) 100vw, 1159px\" \/><p id=\"caption-attachment-647682\" class=\"wp-caption-text\">Figure 2: Horizontal axis brake. One of three different braking mechanisms used in the device. The arrow shows the axis of motion. 
This mechanism consists of a capstan with a helix-wound cord that, when either side of the cord is tensioned, can render high output forces bi-directionally.<\/p><\/div>\n<div id=\"attachment_647685\" style=\"width: 953px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-647685\" class=\"wp-image-647685 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-3.png\" alt=\"Schematic illustration of the capstan brake mechanism, including the solenoid actuators and cord wound around capstan.\" width=\"943\" height=\"595\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-3.png 943w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-3-300x189.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-3-768x485.png 768w\" sizes=\"auto, (max-width: 943px) 100vw, 943px\" \/><p id=\"caption-attachment-647685\" class=\"wp-caption-text\">Figure 3: Capstan brake mechanism in the horizontal axis brake. Two solenoid-actuated \u201cshoes\u201d (upper left and lower right) press the cord against the capstan with a small friction force to prevent rotation.<\/p><\/div>\n<p>In addition to braking the cane\u2019s movement when it collides with a virtual object, we mounted a multifrequency vibrator to the controller to mimic the high frequencies felt when the cane rubs against different textures. The controller feels and sounds different depending on the texture of the surface the virtual cane encounters. When you drag a cane across concrete, it sounds and feels very different from when you drag it across a wood floor or carpeting, and the controller mimics this experience. 
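As an aside on the capstan brake in Figure 3: the force amplification that lets two small solenoid "shoes" resist a hard cane strike follows the classical capstan equation, in which friction grows exponentially with the cord's wrap angle around the drum. A minimal sketch of that relationship (our illustration, not code from the project; the friction coefficient and wrap angle are assumed values):

```python
import math

def required_holding_force(load_force: float, mu: float, wrap_angle: float) -> float:
    """Capstan equation: the small holding tension needed to resist a
    larger load tension on a cord wrapped around a drum.

    load_force: tension applied through the cane by the user (N)
    mu: coefficient of friction between cord and capstan (assumed)
    wrap_angle: total wrap of the cord around the capstan (radians)
    """
    return load_force * math.exp(-mu * wrap_angle)

# With an assumed friction coefficient of 0.3 and one full wrap (2*pi radians),
# resisting a 100 N pull requires only about 15 N of holding force.
print(round(required_holding_force(100.0, 0.3, 2 * math.pi), 1))
```

This exponential amplification is why lightweight actuators suffice to arrest the cane when it meets a virtual wall.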
Finally, we provide 3D audio that is based on the geometry of the environment using <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/project-triton\/\">Project Triton<\/a>, technology developed at Microsoft Research. With this capability, a radio playing around the corner in another room sounds as if it\u2019s coming from that location and traveling around a corner.<\/p>\n<p>Putting all these components together, our controller allowed users who are blind to effectively explore a complicated virtual world of 6 meters by 6 meters to play a scavenger hunt game, locating targets and avoiding obstacles and traps. In user testing, we found that seven out of eight users were able to play the game, successfully navigating to locate targets while avoiding collisions with walls and obstacles (see Figure 4 for details).<\/p>\n<div id=\"attachment_647688\" style=\"width: 1440px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-647688\" class=\"wp-image-647688 size-full\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-4.jpg\" alt=\"A person uses the white cane controller to navigate a virtual environment. The virtual environment contains objects that can be detected by the controller. 
\" width=\"1430\" height=\"488\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-4.jpg 1430w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-4-300x102.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-4-1024x349.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/VR-White-Cane-Fig-4-768x262.jpg 768w\" sizes=\"auto, (max-width: 1430px) 100vw, 1430px\" \/><p id=\"caption-attachment-647688\" class=\"wp-caption-text\">Figure 4: A) A participant navigates through the experimental game using the prototype haptic controller. B) A rendered first-person view of the virtual environment. C) Overhead map of the virtual environment with participant and cane (represented by a blue sphere with line).<\/p><\/div>\n<p>Creating a white cane for the virtual space does come with challenges that we did not anticipate. For example, there are several types of white canes commonly used by people who are blind. These vary in weight, stiffness, and the kinds of tips they use. Some people prefer nylon or roller tips that easily glide over the ground, while others prefer the enhanced sensitivity of a metal tip. In our research, the sounds and feel of our controller were based on a carbon fiber cane with a metal tip. Users who were accustomed to the feedback from a metal-tip cane in the real world could easily identify the experiences they were having in VR. However, people who used a nylon- or roller-tip cane had a harder time identifying VR objects and surfaces because the feeling and sounds were very different to what they were used to. 
In future work, we would like to give users the ability to change the virtual tip and cane materials to match what they typically use in the real world.<\/p>\n<p>Overall, we found that by using our system, we could provide users who are blind with a compelling VR experience through multimodal haptic and audio feedback. Our prototype system suggests that VR doesn\u2019t have to be limited only to those who have certain capabilities. To be clear, our prototype controller is still a long way off from being a commercial product, and there are many obstacles that we must overcome before something like this would be ready for commercialization. However, as VR becomes more common, it is critical that we try to include as many people as possible in our designs. This project shows one way that we can make this a reality.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Virtual reality (VR) is an incredibly exciting way to experience computing, providing users with intuitive and immersive means of interacting with information that attempts to mirror the way we naturally experience the world around us. 
In the past few years, powerful VR systems have dropped in price and are on the verge of becoming mainstream [&hellip;]<\/p>\n","protected":false},"author":38838,"featured_media":647715,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":[{"type":"user_nicename","value":"Ed Cutrell","user_id":"31490"},{"type":"user_nicename","value":"Eyal Ofek","user_id":"31772"}],"msr_hide_image_in_river":0,"footnotes":""},"categories":[1],"tags":[],"research-area":[13554],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-647676","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research-blog","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[199565],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[],"related-projects":[546345],"related-events":[641571],"related-researchers":[{"type":"user_nicename","value":"Ed Cutrell","user_id":31490,"display_name":"Ed Cutrell","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/cutrell\/\" aria-label=\"Visit the profile page for Ed Cutrell\">Ed Cutrell<\/a>","is_active":false,"last_first":"Cutrell, Ed","people_section":0,"alias":"cutrell"}],"msr_type":"Post","featured_image_thumbnail":"<img width=\"960\" height=\"540\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-960x540.png\" class=\"img-object-cover\" alt=\"\" decoding=\"async\" loading=\"lazy\" 
srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-960x540.png 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-1024x576.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-768x432.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-655x368.png 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-343x193.png 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-640x360.png 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2-1280x720.png 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/04\/MSR_VRwithoutVision1400x788-2.png 1400w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/cutrell\/\" title=\"Go to researcher profile for Ed Cutrell\" aria-label=\"Go to researcher profile for Ed Cutrell\" data-bi-type=\"byline author\" data-bi-cN=\"Ed Cutrell\">Ed Cutrell<\/a> and Eyal Ofek","formattedDate":"April 7, 2020","formattedExcerpt":"Virtual reality (VR) is an incredibly exciting way to experience computing, providing users with intuitive and immersive means of interacting with information that attempts to mirror the way we naturally experience the world around us. 
In the past few years, powerful VR systems have dropped&hellip;","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/647676","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/38838"}],"replies":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/comments?post=647676"}],"version-history":[{"count":9,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/647676\/revisions"}],"predecessor-version":[{"id":648306,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/647676\/revisions\/648306"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/647715"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=647676"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/categories?post=647676"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/tags?post=647676"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=647676"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=647676"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=647676"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www
.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=647676"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=647676"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=647676"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=647676"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=647676"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}