Featured AI for Accessibility projects
Check out how our grantees are using AI-powered technology to make the world more inclusive.
Zyrobotics helps make STEM accessible to all children, with tools to learn coding and to become confident and proficient in a digital world.
Increasing reading fluency with AI
Zyrobotics uses Microsoft AI to develop ReadAble Storiez, a STEM-based reading fluency program for students with disabilities and diverse learning needs.
Counting Zoo, an immersive eReader, uses Azure Speech Services to convert speech to onscreen text and incorporates narration, visuals, and interactive play to engage users.
Chatbot to enable support for people with disabilities
The Open University is creating a chatbot to support people with disabilities. ADMINS aims to remove the barriers to independent living caused by forms and administrative processes.
Gamified speech therapy
Speech therapy can be monotonous for kids, and therapists often have little data on how well kids are performing their exercises. Verboso is bridging that gap by creating video games controlled by speech therapy exercises, motivating students and providing real-time feedback for clinicians.
Making people aware of their surroundings
UC Berkeley is building a mobile app for users who are blind or have low vision that uses the device’s sensors and cameras to provide captions and audio descriptions of their surroundings.
Understanding non-standard speech patterns
Voiceitt is building automatic speech recognition technology designed to understand non-standard speech patterns, giving individuals with speech disabilities an enhanced real-time communication platform.
Improving communication for ALS and MS patients
Pison Technology has developed a patented wrist-wearable neuromuscular sensing system that offers hands-free, microgesture-based control of digital platforms, improving communication for individuals with neuromuscular disabilities such as ALS and MS.
ACT Lab enables communication with tools for storytelling from photos
BeatCaps visualizes the rhythm of music with beat tracking
Helping jobseekers find their career paths
Leonard Cheshire’s tool will use Azure AI to help jobseekers with disabilities explore a more person-centered career path. A person will complete an assessment to outline their skills and career goals, and career paths will be recommended based on the jobseeker’s results.
Expanding inclusive hiring
Our Ability is furthering inclusive hiring in the manufacturing and scientific research fields by offering employment seekers with cognitive disabilities an accessible and intuitive AI-powered chatbot to help them prepare for job interviews.
VR job interview training for people with autism
The Frist Center for Autism and Innovation at Vanderbilt University is developing virtual reality-based systems that provide people with autism job interview training through meaningful multimodal interactions.
MSFT Ability Initiative at University of Texas at Austin
The University of Texas is partnering with Microsoft Research and AI for Accessibility to collect and use an expansive labeled dataset to improve the accuracy of automatic image descriptions for people who are blind or have low vision.
Communication support for people with locked-in syndrome
Tokyo Institute of Technology is developing a communication system for people with locked-in syndrome and related disabilities. The interface compares characteristics of a visual target with the user's pupillary responses to enable interaction with the app.
Personalizing object recognition through AI
Through project ORBIT (Object Recognition for Blind Image Training), City University of London is training AI systems for personalizing object recognition. They are also developing an AI curriculum to educate young people who are blind or have low vision.
Personalized and assistive navigation for all pedestrians
Existing navigation apps are designed primarily for drivers, not pedestrians. iMerciv is building MapinHood, an app for pedestrian travel. Using machine learning and inclusive crowdsourcing, they are working to provide personalized, assistive navigation for all pedestrians.
Improving braille literacy skills via gamification
Using speech recognition with a braille display, Braille AI Tutor helps students practice and improve their braille literacy skills via gamification, even when a teacher is not present.
Helping people living with epilepsy
Researchers at the University of Sydney are developing an intelligent, real-time brain signal processing system for people living with epilepsy. The smart seizure advisory system delivers timely warnings about the likelihood of an impending epileptic seizure.
Helping people navigate around town
Mass Eye and Ear, a Harvard Medical School affiliate, is advancing navigation services for people who are blind or have ultra-low vision. While many services specify only the vicinity of a bus stop, their SuperVision Search app guides users to exact bus stop locations in 12 cities.
Exercise app for people who have low vision
Researchers at the University of Iowa are developing an intelligent application that helps people who are blind or have low vision independently walk around a 400-meter track. The app can determine if someone is veering from their lane and delivers real-time feedback to help them stay on track.