{"id":1084560,"date":"2024-10-24T10:25:53","date_gmt":"2024-10-24T17:25:53","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-blog-post&#038;p=1084560"},"modified":"2024-11-20T10:15:47","modified_gmt":"2024-11-20T18:15:47","slug":"timeline-assistive-technology-at-microsoft-research","status":"publish","type":"msr-blog-post","link":"https:\/\/www.microsoft.com\/en-us\/research\/articles\/timeline-assistive-technology-at-microsoft-research\/","title":{"rendered":"Timeline: Assistive technology at Microsoft Research"},"content":{"rendered":"\n<p>Empowering every person and organization on the planet to achieve more requires giving ownership of the computing experience to the individual and leveraging technological advancements to deliver products, tools, and services that people can make work for them\u2014whatever their circumstances or abilities. Over the years, Microsoft Research has collaborated closely with people with disabilities and those who support them to thoughtfully and creatively innovate around this commitment to inclusive design and accessible technology. Below is a sampling of those efforts. 
To learn more, explore <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/aka.ms\/Find-My-Things\">researchers\u2019 experience developing the teachable AI tool Find My Things<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> or, for research in assistive technologies and beyond, explore <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/\">Microsoft Research<\/a>.<\/p>\n\n\n\t<div class=\"wp-block-msr-block-journey journey journey--date alignwide\" data-bi-aN=\"block-journey\">\n\t\t<ol class=\"journey__list\">\n\t\t\t\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2024\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tSep\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"find-my-things-recognized-for-innovative-design\">Find My Things recognized for innovative design<\/h3>\n\n\n\n<p>Find My Things, the object recognition tool that can be personalized from a few videos of an item and is available in the Seeing AI mobile app, is a finalist in the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.fastcompany.com\/91128700\/accessible-design-innovation-by-design-2024\" target=\"_blank\" rel=\"noopener noreferrer\">accessible design<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> and <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" 
href=\"https:\/\/www.fastcompany.com\/91129207\/artificial-intelligence-innovation-by-design-2024\" target=\"_blank\" rel=\"noopener noreferrer\">artificial intelligence<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> categories of the US-based business media brand Fast Company\u2019s Innovation by Design Awards. Find My Things was developed by members of the Microsoft Research <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/taix\/\">Teachable AI Experiences (Tai X)<\/a> team and a group of citizen designers and was integrated into Seeing AI earlier in the year.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--1\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/understanding-personalized-accessibility-through-teachable-ai-designing-and-evaluating-find-my-things-for-people-who-are-blind-or-low-vision\/\">Read ASSETS 2023 paper<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--2\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/find-my-things-personalized-accessibility-through-teachable-ai-for-people-who-are-blind-or-low-vision\/\">Read CHI 2024 paper<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--3\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.seeingai.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Get the Seeing AI app<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" 
role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2023\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"foundation-models-used-to-support-blind-community\">Foundation models used to support blind community<\/h3>\n\n\n\n<p>As part of Microsoft Research\u2019s Accelerate Foundation Models Research (AFMR) initiative, a team from Waseda University is developing a system that will leverage vision and language foundation models to help people who are blind or have low vision with outdoor navigation.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--4\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/collaboration\/accelerating-foundation-models-research\/\">Explore AFMR<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2023\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tJul\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"asl-citizen-dataset-released\">ASL Citizen dataset released<\/h3>\n\n\n\n<p>Microsoft releases ASL Citizen, the <a 
href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/asl-citizen\/\">first crowdsourced isolated sign language dataset<\/a>. Built in collaboration with members of the Deaf community, ASL Citizen aims to tackle the data shortage preventing the advancement of AI systems that support sign language users. It\u2019s a challenge that Microsoft researchers have been targeting for years, including with a <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/asl-sea-battle-gamifying-sign-language-data-collection\/\">sign language game for addressing inaccurate labels, a lack of real-world settings, and other issues<\/a> present in existing sign language datasets and an <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/the-fate-landscape-of-sign-language-ai-datasets-an-interdisciplinary-perspective\/\">exploration of fairness, accountability, transparency, and ethics considerations in collecting sign language data<\/a>.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--5\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/download\/details.aspx?id=105253\" target=\"_blank\" rel=\"noreferrer noopener\">ASL Citizen dataset<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-fill-github\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/github.com\/microsoft\/ASL-citizen-code\" target=\"_blank\" rel=\"noreferrer noopener\">Code and baselines<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div 
class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2022\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tSep\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"peoplelens-inspires-find-my-things\">PeopleLens inspires Find My Things<\/h3>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/peoplelens-using-ai-to-support-social-interaction-between-children-who-are-blind-and-their-peers\/\">Students in the United Kingdom between the ages of 5 and 11 test the advanced research prototype PeopleLens<\/a>. The head-worn device leverages AI and spatialized audio to identify people in a room, helping users situate themselves in social scenarios and more confidently interact with those around them. During development, the PeopleLens research team identified the value personalization could add to such an experience. 
Given the complexity of social encounters, the team opted to examine personalization in a more straightforward application\u2014object recognition\u2014planting the seed for the personalizable object recognizer Find My Things.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--6\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/peoplelens\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2021\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tOct\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"ludic-design-for-accessibility-prioritizes-play\">Ludic Design for Accessibility prioritizes play<\/h3>\n\n\n\n<p>Researchers introduce Ludic Design for Accessibility (LDA), a multistep approach that prioritizes play and exploration in the design of assistive technologies. LDA was inspired by earlier <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/video-gaming-for-the-vision-impaired\/\">work that leveraged spatial audio technology to help make mainstream video games enjoyable for people with low vision<\/a>. 
The approach has informed Microsoft Research projects such as an <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/vstroll-an-audio-based-virtual-exploration-to-encourage-walking-among-people-with-vision-impairments\/\">audio-based app designed to encourage physical activity among people who are blind or have low vision<\/a> and a study of <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/teachers-perceptions-around-digital-games-for-children-in-low-resource-schools-for-the-blind\/\">how teachers in low-resource schools believe digital games can best serve students who are blind<\/a>. Recently, Microsoft Research India collaborated with the Srishti Manipal Institute of Art, Design and Technology to incorporate LDA into its curriculum, tasking students at the institute with designing play-based learning and skill-building experiences for children with disabilities.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"yt-consent-placeholder\" role=\"region\" aria-label=\"Video playback requires cookie consent\" data-video-id=\"OyuNdEZz9y8\" data-poster=\"https:\/\/img.youtube.com\/vi\/OyuNdEZz9y8\/maxresdefault.jpg\"><iframe title=\"Ludic Design for Accessibility\" width=\"500\" height=\"281\" frameborder=\"0\" allowfullscreen><\/iframe><div class=\"yt-consent-placeholder__overlay\"><button class=\"yt-consent-placeholder__play\"><span class=\"yt-consent-placeholder__label\">Video playback requires cookie consent<\/span><\/button><\/div><\/div>\n<\/div><\/figure>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--7\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" 
href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/ludic-design-for-accessibility-in-the-global-south\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2021\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tOct\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"orbit-dataset-and-benchmark-released-to-github\">ORBIT dataset and benchmark released<\/h3>\n\n\n\n<p>Microsoft releases the ORBIT dataset and benchmark. The project <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/wheres-my-stuff-developing-ai-with-help-from-people-who-are-blind-or-low-vision-to-meet-their-needs\/\">invited members of the blind and low-vision community to contribute videos of personal items they interact with regularly<\/a>. 
These videos were used to build a training dataset that is more inclusive of the objects people who are blind or have low vision might use, such as guide canes, and more representative of the variation in quality of images and videos captured by people who are blind or have low vision\u2014common challenges in existing datasets.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--8\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/orbit-dataset\/\">Read the paper<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-fill-github\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/github.com\/microsoft\/ORBIT-Dataset\" target=\"_blank\" rel=\"noreferrer noopener\">Get the ORBIT dataset and code<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2021\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tOct\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"alt-text-experience-studied\">Alt-text experience studied<\/h3>\n\n\n\n<p>Researchers develop and test different interfaces for writing alt text from scratch and for providing feedback on AI-generated alt text within Microsoft PowerPoint, work that is influential in the alt-text authoring experience available in the presentation 
software. Prior work in alt text\u2014that is, the copy used by screen readers to describe visual content to people who are blind or have low vision\u2014includes studies of <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/person-shoes-tree-is-the-person-naked-what-people-with-vision-impairments-want-in-image-descriptions\/\">people\u2019s experiences with alt text across different types of websites<\/a> and <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/understanding-blind-peoples-experiences-computer-generated-captions-social-media-images\/\">specifically when it comes to images on social media<\/a>, as well as work on <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/toward-scalable-social-alt-text-conversational-crowdsourcing-tool-refining-vision-language-technology-blind\/\">improving the alt text experience in social media settings<\/a>.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--9\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/designing-tools-for-high-quality-alt-text-authoring\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2021\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tApr\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" 
id=\"title-tbd-1\">Teleworkers share perspectives during pandemic<\/h3>\n\n\n\n<p>The rapid and wide-scale adoption of remote work caused by the pandemic gives teleworkers with disabilities already participating in a work-from-home study the opportunity to share how their colleagues\u2019 shift to remote work affected their experience. This work, along with publications that <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/towards-accessible-remote-work-understanding-work-from-home-practices-of-neurodivergent-professionals\/\">examine the accessibility of remote work through the experience of neurodivergent professionals<\/a> and that <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/accessibility-barriers-conflicts-and-repairs-understanding-the-experience-of-professionals-with-disabilities-in-hybrid-meetings\/\">identify the accessibility barriers, opportunities, and conflicts specific to hybrid meetings<\/a>, plays a role in expanding how Microsoft understands the impact of remote and hybrid work on people with disabilities.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--10\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/understanding-the-telework-experience-of-people-with-disabilities-2\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div 
class=\"moment__date-year\">\n\t\t\t\t\t2020\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tSep\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"microsoft-expressive-pixels-released\">Microsoft Expressive Pixels released<\/h3>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-1024x576.jpg\" alt=\"screenshot of Microsoft Expressive Pixels app home screen\" class=\"wp-image-735895\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-16x9.jpg 16w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-343x193.jpg 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788-1280x720.jpg 1280w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/03\/EPXHome_HCI_multi_1400x788.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p>Microsoft Expressive Pixels is made available for free in the Microsoft Store. The development of Expressive Pixels, a platform for the creation and LED display of animations and images, was <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/expressive-pixels-a-new-visual-communication-platform-to-support-creativity-accessibility-and-innovation\/\">grounded in the Microsoft Research Enable team\u2019s extensive work with people living with amyotrophic lateral sclerosis (ALS) and others in the ALS community<\/a> and a goal of extending communication abilities for people with varying levels of speech and mobility. The work drew on expertise from across Microsoft, including the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/group\/ability\/?msockid=3477f5594d38661123f8e1164caa6783\">Microsoft Research Ability team<\/a>.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--11\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/blogs.windows.com\/windowsexperience\/2020\/09\/03\/microsoft-expressive-pixels-a-platform-for-creativity-inclusion-and-innovation\/\" target=\"_blank\" rel=\"noreferrer noopener\">Read the article<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--12\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/welcome-to-microsoft-expressive-pixels\/\">Watch or listen to the video<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--13\"><a data-bi-type=\"button\" 
class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/apps.microsoft.com\/detail\/9mtc56w1rxqh?rtc=1&hl=en-us&gl=US\" target=\"_blank\" rel=\"noreferrer noopener\">Download Expressive Pixels<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2020\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tApr\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"researchers-work-to-make-vr-and-ar-accessible\">Researchers work to make VR and AR accessible<\/h3>\n\n\n\n<p>Researchers with the Microsoft Research Ability Group receive honorable mention at the 2020 ACM CHI Conference on Human Factors in Computing Systems for their haptic and auditory white cane for navigating large, complex virtual environments. The team created a scavenger hunt to test the cane\u2019s ability to give users a sense of the shapes of different virtual objects and different surface textures. 
The work is a follow-up to an <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/enabling-people-visual-impairments-navigate-virtual-reality-haptic-auditory-cane-simulation-2\/\">earlier haptic cane controller<\/a> and is among a line of research dedicated to making virtual and augmented reality more accessible, including to <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/two-in-one-a-design-space-for-mapping-unimanual-input-into-bimanual-interactions-in-vr-for-users-with-limited-movement\/\">people with varying levels of mobility<\/a> and to people who are <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/towards-sound-accessibility-in-virtual-reality\/\">deaf or have a hearing disability<\/a>.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--14\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/virtual-reality-without-vision-a-haptic-and-auditory-white-cane-to-navigate-complex-virtual-worlds\/\">Read the paper<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--15\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/youtu.be\/rIuJuRvnOEw\" target=\"_blank\" rel=\"noreferrer noopener\">Watch or listen to the video<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2020\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div 
class=\"moment__date-month\">\n\t\t\t\t\tFeb\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"seeingvr-toolkit-made-open-source\">SeeingVR toolkit made open source<\/h3>\n\n\n\n<p>SeeingVR, a set of tools for improving the virtual reality experience for people with low vision, is made open source. The tools include visual and audio augmentations, such as magnification and brightness lenses and text-to-speech functionality.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--16\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/seeingvr-a-set-of-tools-to-make-virtual-reality-more-accessible-to-people-with-low-vision-2\/\">Read the paper<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-fill-github\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/github.com\/microsoft\/SeeingVRtoolkit\" target=\"_blank\" rel=\"noreferrer noopener\">Get the toolkit<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2019\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tFeb\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"title-tbd-3\">Event explores sign language\u2013based 
systems<\/h3>\n\n\n\n<p>Microsoft Research hosts <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/event\/microsoft-ai-for-accessibility-sign-language-recognition-translation-workshop\/\">a two-day interdisciplinary workshop<\/a> to identify the opportunities and obstacles in the space of sign language recognition, generation, and translation and publishes key insights and findings from the workshop later in the year. The publication wins best paper at the International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS).<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--17\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/sign-language-recognition-generation-and-translation-an-interdisciplinary-perspective\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2019\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tJan\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"title-tbd-4\">Code Jumper tech transferred<\/h3>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"683\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/01\/Torino-NWC_0024-1920x1280-1024x683.jpg\" alt=\"Rico was part of a 
group of students at New College Worcester in Worcester, UK, who participated in a beta test of the technology behind Code Jumper. Photo by Jonathan Banks.\" class=\"wp-image-561747\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/01\/Torino-NWC_0024-1920x1280-1024x683.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/01\/Torino-NWC_0024-1920x1280-300x200.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/01\/Torino-NWC_0024-1920x1280-768x512.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/01\/Torino-NWC_0024-1920x1280.jpg 1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Rico was part of a group of students at New College Worcester in Worcester, UK, who participated in a beta test of the technology behind Code Jumper. Photo by Jonathan Banks for Microsoft.<\/figcaption><\/figure>\n\n\n\n<p>Microsoft transfers the research and technology behind Code Jumper, a physical programming language developed as part of Project Torino, to the nonprofit American Printing House for the Blind for broad distribution.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--18\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/news.microsoft.com\/source\/features\/innovation\/project-torino-code-jumper\" target=\"_blank\" rel=\"noreferrer noopener\">Read the article<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--19\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/project-torino\/\">Learn more about Torino<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" 
role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2018\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tMar\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"title-tbd-6\">Hands-Free Music project receives SXSW award<\/h3>\n\n\n\n<p>The <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/microsoft-hands-free-music\/\">Microsoft Hands-Free Music<\/a> project wins the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.sxsw.com\/interactive\/2018\/announcing-2018-winners-interactive-innovation-awards\/\" target=\"_blank\" rel=\"noopener noreferrer\">SXSW Interactive Innovation Award for music and audio innovation<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>; the project is also a <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.sxsw.com\/news\/2018\/announcing-2018-interactive-innovation-awards-finalists\/\" target=\"_blank\" rel=\"noopener noreferrer\">finalist in the \u201cinnovation in connecting people\u201d category<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. Developed in collaboration with members of the ALS community, the Hands-Free Music project is a suite of music-making and expressive tools that can be controlled via eye gaze and other hands-free interaction techniques, such as adaptive controllers. 
In the years since the SXSW honor, the project team has developed an <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/cyclops-designing-an-eye-controlled-instrument-for-accessibility-and-flexible-use\/\">eye-gaze-controlled synthesizer and sequencer for live play and improvisation<\/a> and the adaptive musical instrument Galactic Bell Star, among other work. A group of wounded veterans used the project\u2019s technologies to write an anthem for the Invictus Games.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"yt-consent-placeholder\" role=\"region\" aria-label=\"Video playback requires cookie consent\" data-video-id=\"TQPGb1I21uE\" data-poster=\"https:\/\/img.youtube.com\/vi\/TQPGb1I21uE\/maxresdefault.jpg\"><iframe title=\"Microsoft Hands-Free Music\" width=\"500\" height=\"281\" frameborder=\"0\" allowfullscreen><\/iframe><div class=\"yt-consent-placeholder__overlay\"><button class=\"yt-consent-placeholder__play\"><span class=\"yt-consent-placeholder__label\">Video playback requires cookie consent<\/span><\/button><\/div><\/div>\n<\/div><\/figure>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--20\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/unlocked.microsoft.com\/invictus-anthem-for-all\/\" target=\"_blank\" rel=\"noreferrer noopener\">Learn about Anthem for All<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--21\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/galactic-bell-star-music-demo-expressive-pixels\/\">Watch or listen to Galactic Bell Star demo<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div 
class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2017\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tOct\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"eye-control-introduced-in-windows-10\">Eye Control introduced in Windows 10<\/h3>\n\n\n\n<p><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/blogs.microsoft.com\/accessibility\/windows-10-accessibility-update\/\" target=\"_blank\" rel=\"noopener noreferrer\">Eye Control is introduced in Windows 10<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, enhancing the way people, including those with varying levels of mobility, use their PCs for work and play. Eye Control leverages gaze-tracking technology, and with an appropriate eye-tracking device, users can type, direct their mouse, and activate text-to-speech via eye movement. The feature is a result of a <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/news.microsoft.com\/source\/features\/innovation\/empathy-innovation-accessibility\/\" target=\"_blank\" rel=\"noopener noreferrer\">collaboration between the Microsoft Research Enable team and the Windows Text Input development team<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. The Enable team\u2019s earlier work outfitting a motorized wheelchair with eye-tracking technology laid the groundwork. 
Microsoft later leads a consortium of technology companies to create <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/blog.eyetechds.com\/eyetech-ds-microsoft-work-together-to-release-usb-hid-standard-for-eye-tracking\" target=\"_blank\" rel=\"noopener noreferrer\">the first USB HID (Human Interface Device) standard for eye tracking<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> and <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/blogs.windows.com\/windowsdeveloper\/2018\/05\/08\/how-to-create-accessible-apps-and-immersive-game-experiences-with-new-eye-tracking-apis\/\" target=\"_blank\" rel=\"noopener noreferrer\">releases a set of APIs behind Eye Control<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> for use by the developer community.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"yt-consent-placeholder\" role=\"region\" aria-label=\"Video playback requires cookie consent\" data-video-id=\"NXOzDf26FnU\" data-poster=\"https:\/\/img.youtube.com\/vi\/NXOzDf26FnU\/maxresdefault.jpg\"><iframe title=\"How Eye Control empowers people with disabilities (Audio Description)\" width=\"500\" height=\"281\" frameborder=\"0\" allowfullscreen><\/iframe><div class=\"yt-consent-placeholder__overlay\"><button class=\"yt-consent-placeholder__play\"><span class=\"yt-consent-placeholder__label\">Video playback requires cookie consent<\/span><\/button><\/div><\/div>\n<\/div><figcaption class=\"wp-element-caption\"><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/8zQN9Zn7I7U\" target=\"_blank\" rel=\"noopener noreferrer\">Version without audio description<span class=\"sr-only\"> (opens in new 
tab)<\/span><\/a><\/figcaption><\/figure>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2017\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tJul\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"seeing-ai-released\">Seeing AI released<\/h3>\n\n\n\n<p>Over a year after Microsoft CEO Satya Nadella introduces a prototype called Seeing AI at Build 2016, the mobile app is broadly released for use on iOS. Leveraging computer vision, natural language processing, and other technologies, Seeing AI assists people who are blind or low vision by identifying people and things in their surroundings, including scanning and reading aloud short text and documents. Since its release, Seeing AI features have expanded to include <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/blogs.microsoft.com\/accessibility\/seeing-ai-new-features\/\" target=\"_blank\" rel=\"noopener noreferrer\">color and currency recognition<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> and, more recently, <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/blogs.microsoft.com\/accessibility\/seeing-ai-app-launches-on-android-including-new-and-updated-features-and-new-languages\/\" target=\"_blank\" rel=\"noopener noreferrer\">richer descriptions and question-answering powered by generative AI<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. 
Seeing AI, now also available on Android devices, has been recognized for its innovative design and impact, including by the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.fastcompany.com\/innovation-by-design\/2018\/category\/apps-and-games\" target=\"_blank\" rel=\"noopener noreferrer\">US-based business media brand Fast Company<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.gsma.com\/newsroom\/press-release\/gsma-announces-winners-2018-glomo-awards\/\" target=\"_blank\" rel=\"noopener noreferrer\">the mobile industry group GSMA<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, and the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/docs.fcc.gov\/public\/attachments\/DOC-351574A1.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Federal Communications Commission<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"yt-consent-placeholder\" role=\"region\" aria-label=\"Video playback requires cookie consent\" data-video-id=\"rVF2duPVUTY\" data-poster=\"https:\/\/img.youtube.com\/vi\/rVF2duPVUTY\/maxresdefault.jpg\"><iframe title=\"Satya Nadella introducing Seeing AI Prototype at Build 2016 conference\" width=\"500\" height=\"281\" frameborder=\"0\" allowfullscreen><\/iframe><div class=\"yt-consent-placeholder__overlay\"><button class=\"yt-consent-placeholder__play\"><span class=\"yt-consent-placeholder__label\">Video playback requires cookie consent<\/span><\/button><\/div><\/div>\n<\/div><\/figure>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div 
class=\"wp-block-button is-style-outline is-style-outline--22\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.seeingai.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Get the Seeing AI app<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2014\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tNov\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 class=\"wp-block-heading moment__title\" id=\"title-tbd-7\">Audio-based prototype for exploration launched<\/h3>\n\n\n\n<p>Microsoft, Guide Dogs, and Future Cities Catapult launch the Cities Unlocked prototype. Cities Unlocked, which would come to be known as <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/product\/soundscape\/\">Microsoft Soundscape<\/a>, uses 3D audio cues to \u201cunlock\u201d users\u2019 surroundings, identifying points of interest, street names and intersections, and the direction of travel in real time, empowering members of the blind and low-vision community to explore new places. 
In 2018, <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/blogs.microsoft.com\/accessibility\/soundscape\/\">Soundscape became available on iOS<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> and <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/microsoft-soundscape-new-horizons-with-a-community-driven-approach\/\">was later made open source for broader development<\/a>, including by <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.scottishtecharmy.org\/soundscape\" target=\"_blank\" rel=\"noopener noreferrer\">the tech-for-good organization Scottish Tech Army<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--23\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/ukstories.microsoft.com\/2014\/11\/05\/unlockingcitiesthrou\/?_gl=1*5nqujl*_ga*NTIxODI5MTgyLjE3MTY1MDA4NDE.*_ga_JN5MSL685T*MTcyNTcyNTEwMi40MC4xLjE3MjU3MjUxMzIuMzAuMC4w\" target=\"_blank\" rel=\"noreferrer noopener\">Read the article<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\n\t<li class=\"wp-block-msr-block-moment moment has-date\" data-bi-aN=\"block-moment\">\n\t\t<div class=\"moment__dot moment__dot--start\" role=\"presentation\"><\/div>\n\t\t<div role=\"presentation\"><\/div>\n\t\t<div class=\"moment__details\">\n\t\t\t\t\t\t<div class=\"moment__counter\"><\/div>\n\t\t\t\t\t\t\t<div class=\"moment__date-year\">\n\t\t\t\t\t2014\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t<div class=\"moment__date-month\">\n\t\t\t\t\tAug\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t<div class=\"moment__content\">\n\t\t\t\n\n<h3 
class=\"wp-block-heading moment__title\" id=\"title-tbd-8\">Eye Gaze Wheelchair wins company hackathon<\/h3>\n\n\n\n<p>A motorized wheelchair controllable by a user\u2019s eye movements takes top prize at the first Microsoft companywide hackathon. The work led to the creation of the Enable Group to further develop the wheelchair and other assistive technologies for those with neuromotor disabilities. The motivation behind the \u201cEye Gaze Wheelchair\u201d was a <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/blogs.microsoft.com\/accessibility\/from-hack-to-product-microsoft-empowers-people-with-eye-control-for-windows-10\/\" target=\"_blank\" rel=\"noopener noreferrer\">call to action from former NFL player Steve Gleason<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, who had been diagnosed with ALS a few years earlier and uses a wheelchair to get around. Gleason was seeking tech help to navigate more independently and interact with his family more easily. 
He was an integral part of the development of the chair, and his foundation leveraged the technology behind it to help bring <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.numotion.com\/about-us\/news\/numotion-joins-former-nfl-player-steve-gleason-and\" target=\"_blank\" rel=\"noopener noreferrer\">eye-drive capability to market in 2019<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"606\" height=\"388\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/10\/hackathon-eye-gaze-wheelchair_cropped.png\" alt=\"a group of people sitting in chairs\" class=\"wp-image-1095225\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/10\/hackathon-eye-gaze-wheelchair_cropped.png 606w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/10\/hackathon-eye-gaze-wheelchair_cropped-300x192.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/10\/hackathon-eye-gaze-wheelchair_cropped-240x154.png 240w\" sizes=\"auto, (max-width: 606px) 100vw, 606px\" \/><\/figure>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--24\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/news.microsoft.com\/features\/one-year-later-hackathon-project-and-msr-enable-team-push-limits-on-technology-that-empowers\/\" target=\"_blank\" rel=\"noreferrer noopener\">Read the story<\/a><\/div>\n<\/div>\n\n\t\t<\/div>\n\t\t<div class=\"moment__dot moment__dot--end\" role=\"presentation\"><\/div>\n\t<\/li>\n\t\n\t\t<\/ol>\n\t<\/div>\n\t\n\n\n<p><strong><em>Timeline contributors:<\/em><\/strong><em> Mary Bellard, Neeltje Berger, Danielle Bragg, David Celis Garcia, Matt Corwine, Ed Cutrell, 
Kristina Dodge, Martin Grayson, Alyssa Hughes, Daniela Massiceti, Amanda Melfi, Ann Paradiso, Brenda Potts, Carly Quill, Katie Recken, John Tang, Patti Thibodeau, Amber Tingle, Saqib Shaikh, Manohar Swaminathan, Sarah Wang, Larry West, and Katie Zoller. Code Jumper photo by Jonathan Banks for Microsoft.<\/em>\u00a0<\/p>\n\n\n\n<div style=\"height:20px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n","protected":false},"excerpt":{"rendered":"<p>For many years, Microsoft Research has empowered people with disabilities by creating leading-edge assistive technologies\u2014from wheelchairs they can control with their eyes to AI-based tools they can train to find lost personal items. Learn more.<\/p>\n","protected":false},"author":42735,"featured_media":1084518,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-content-parent":1084509,"msr_hide_image_in_river":null,"footnotes":""},"research-area":[13556,13554],"msr-locale":[268875],"msr-post-option":[],"class_list":["post-1084560","msr-blog-post","type-msr-blog-post","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_assoc_parent":{"id":1084509,"type":"story"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-blog-post\/1084560","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-blog-post"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-blog-post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/42735"}],"version-history":[{"count":53,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-blog-post\/1084560\/revisions"}],"predecessor-versio
n":[{"id":1096656,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-blog-post\/1084560\/revisions\/1096656"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/1084518"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=1084560"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=1084560"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=1084560"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=1084560"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}