{"id":455112,"date":"2018-04-21T15:38:13","date_gmt":"2018-04-21T22:38:13","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&#038;p=455112"},"modified":"2020-04-03T16:48:54","modified_gmt":"2020-04-03T23:48:54","slug":"enabling-people-visual-impairments-navigate-virtual-reality-haptic-auditory-cane-simulation-2","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/enabling-people-visual-impairments-navigate-virtual-reality-haptic-auditory-cane-simulation-2\/","title":{"rendered":"Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation"},"content":{"rendered":"<div id=\"attachment_471141\" style=\"width: 1034px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-471141\" class=\"wp-image-471141 size-large\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/03\/Figure-6-canetroller-1024x264.png\" alt=\"\" width=\"1024\" height=\"264\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/03\/Figure-6-canetroller-1024x264.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/03\/Figure-6-canetroller-300x77.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/03\/Figure-6-canetroller-768x198.png 768w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><p id=\"caption-attachment-471141\" class=\"wp-caption-text\">(A) A blind user wearing the gear for our VR evaluation, including a VR headset and Canetroller, our haptic VR controller. (B) The mechanical elements of Canetroller. (C) Overlays of the virtual scene atop the real scene show how the virtual cane extends past the tip of the Canetroller device and can interact with the virtual trash bin. 
(D) The use of Canetroller to navigate a virtual street crossing: the inset shows the physical environment, while the rendered image shows the corresponding virtual scene. Note that users did not have any visual feedback when using our VR system. The renderings are shown here for clarity.<\/p><\/div>\n<p>Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. We created <i>Canetroller<\/i>, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback: (1) physical resistance generated by a wearable programmable brake mechanism that physically impedes the controller when the virtual cane comes in contact with a virtual object; (2) vibrotactile feedback that simulates the vibrations when a cane hits an object or touches and drags across various surfaces; and (3) spatial 3D auditory feedback simulating the sound of real-world cane interactions. We designed indoor and outdoor VR scenes to evaluate the effectiveness of our controller. Our study showed that Canetroller was a promising tool that enabled visually impaired participants to navigate different virtual spaces. We discuss potential applications supported by Canetroller ranging from entertainment to mobility training.<\/p>\n\t<iframe\n\t\tsrc=\"https:\/\/www.youtube.com\/embed\/Q1jHXxUBJ8o?rel=0\"\n\t\twidth=\"560\"\n\t\theight=\"315\"\n\t\taria-label=\"\"\n\t\tallowfullscreen=\"true\">\n\t<\/iframe>\n\t\n","protected":false},"excerpt":{"rendered":"<p>Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. 
We created Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback: [&hellip;]<\/p>\n","protected":false},"featured_media":433023,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":null,"msr_publishername":"ACM","msr_publisher_other":"","msr_booktitle":"","msr_chapter":"","msr_edition":"","msr_editors":"","msr_how_published":"","msr_isbn":"","msr_issue":"","msr_journal":"","msr_number":"","msr_organization":"","msr_pages_string":"","msr_page_range_start":"","msr_page_range_end":"","msr_series":"","msr_volume":"","msr_copyright":"","msr_conference_name":"2018 ACM Conference on Human Factors in Computing Systems (CHI)","msr_doi":"","msr_arxiv_id":"","msr_s2_paper_id":"","msr_mag_id":"","msr_pubmed_id":"","msr_other_authors":"","msr_other_contributors":"","msr_speaker":"","msr_award":"","msr_affiliation":"","msr_institution":"","msr_host":"","msr_version":"","msr_duration":"","msr_original_fields_of_study":"","msr_release_tracker_id":"","msr_s2_match_type":"","msr_citation_count_updated":"","msr_published_date":"2018-4-21","msr_highlight_text":"","msr_notes":"","msr_longbiography":"","msr_publicationurl":"","msr_external_url":"","msr_secondary_video_url":"","msr_conference_url":"","msr_journal_url":"","msr_s2_pdf_url":"","msr_year":0,"msr_citation_count":0,"msr_influential_citations":0,"msr_reference_count":0,"msr_s2_match_confidence":0,"msr_microsoftintellectualproperty":true,"msr_s2_open_access":false,"msr_s2_author_ids":[],"msr_pub_ids":[],"msr_hide_image_in_river":0,"footnotes":""},"msr-research-highlight":[],"research-area":[13554,13553],"msr-publication-type":[193716],"msr-publisher":[],"msr-focus-area":[],"msr-locale":[268875],
"msr-post-option":[],"msr-field-of-study":[],"msr-conference":[],"msr-journal":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-455112","msr-research-item","type-msr-research-item","status-publish","has-post-thumbnail","hentry","msr-research-area-human-computer-interaction","msr-research-area-medical-health-genomics","msr-locale-en_us"],"msr_publishername":"ACM","msr_edition":"","msr_affiliation":"","msr_published_date":"2018-4-21","msr_host":"","msr_duration":"","msr_version":"","msr_speaker":"","msr_other_contributors":"","msr_booktitle":"","msr_pages_string":"","msr_chapter":"","msr_isbn":"","msr_journal":"","msr_volume":"","msr_number":"","msr_editors":"","msr_series":"","msr_issue":"","msr_organization":"","msr_how_published":"","msr_notes":"","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":1,"msr_main_download":"455115","msr_publicationurl":"","msr_doi":"","msr_publication_uploader":[{"type":"file","viewUrl":"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/01\/canetroller.pdf","id":"455115","title":"canetroller","label_id":"243109","label":0}],"msr_related_uploader":"","msr_citation_count":0,"msr_citation_count_updated":"","msr_s2_paper_id":"","msr_influential_citations":0,"msr_reference_count":0,"msr_arxiv_id":"","msr_s2_author_ids":[],"msr_s2_open_access":false,"msr_s2_pdf_url":null,"msr_attachments":[],"msr-author-ordering":[{"type":"text","value":"Yuhang Zhao","user_id":0,"rest_url":false},{"type":"text","value":"Cynthia Bennett","user_id":0,"rest_url":false},{"type":"text","value":"Hrvoje Benko","user_id":0,"rest_url":false},{"type":"user_nicename","value":"Ed Cutrell","user_id":31490,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Ed Cutrell"},{"type":"user_nicename","value":"Christian 
Holz","user_id":31415,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Christian Holz"},{"type":"user_nicename","value":"Meredith Ringel Morris","user_id":32884,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Meredith Ringel Morris"},{"type":"user_nicename","value":"Mike Sinclair","user_id":33669,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Mike Sinclair"}],"msr_impact_theme":[],"msr_research_lab":[199565],"msr_event":[472329],"msr_group":[283244,371909,379814,396845],"msr_project":[432870,638733,312002],"publication":[],"video":[],"msr-tool":[],"msr_publication_type":"inproceedings","related_content":{"projects":[{"ID":432870,"post_title":"Haptic Controllers &amp; Rich Haptics for MR","post_name":"haptic-controllers","post_type":"msr-project","post_date":"2018-03-08 08:00:35","post_modified":"2020-12-23 06:58:19","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/haptic-controllers\/","post_excerpt":"We have been exploring a number of ways in which technology can generate a wide range of haptic sensations that may fit within held Virtual Reality controllers, not unlike the ones currently being used by consumers. Enabling users to touch and grasp virtual objects, feel the sliding of their fingertips on the surface of the objects and more. 
The ultimate goal: Allowing users to interact with the virtual digital world, in more natural ways than&hellip;","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/432870"}]}},{"ID":638733,"post_title":"Accessible Mixed Reality","post_name":"accessible-mixed-realityanast","post_type":"msr-project","post_date":"2020-02-22 17:15:43","post_modified":"2021-07-07 11:23:07","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/accessible-mixed-realityanast\/","post_excerpt":"This project considers how to design emerging Mixed Reality technologies (including AR and VR) so that they are usable by and useful to people of all abilities. Too often, the accessibility of technology to people with disabilities is an afterthought (if it is considered at all); post-hoc or third-party patches to accessibility, while better than no solution, are less optimal than interface designs that consider ability-based concerns from the start. Mixed Reality (MR) technologies are&hellip;","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/638733"}]}},{"ID":312002,"post_title":"Rich Haptic Feedback in Virtual Reality","post_name":"rich-haptic-feedback-in-virtual-reality","post_type":"msr-project","post_date":"2016-10-27 13:53:37","post_modified":"2020-12-01 18:11:13","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/rich-haptic-feedback-in-virtual-reality\/","post_excerpt":"Publications Lung-Pan Cheng, Eyal Ofek, Christian Holz, Hrvoje Benko &amp; Andrew D. Wilson Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop In proceedings of ACM CHI 2017 PDF \u00a0 Video &nbsp; Hrvoje Benko, Christian Holz, Mike Sinclair, and Eyal Ofek NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers.
In proceedings of ACM UIST 2016 PDF \u00a0 Video &nbsp; Mahdi Azmandian, Mark Hancock, Hrvoje Benko, Eyal&hellip;","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/312002"}]}}]},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/455112","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":4,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/455112\/revisions"}],"predecessor-version":[{"id":638739,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/455112\/revisions\/638739"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/433023"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=455112"}],"wp:term":[{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=455112"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=455112"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=455112"},{"taxonomy":"msr-publisher","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publisher?post=455112"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=455112"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/resea
rch\/wp-json\/wp\/v2\/msr-locale?post=455112"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=455112"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=455112"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=455112"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=455112"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=455112"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=455112"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}