Imagining artificial intelligence applications with people with visual disabilities using tactile ideation

ASSETS '17

Published by ACM

There has been a surge in artificial intelligence (AI) technologies co-opted by or designed for people with visual disabilities. Researchers and engineers have pushed technical boundaries in areas such as computer vision, natural language processing, location inference, and wearable computing. But what do people with visual disabilities imagine as their own technological future? To explore this question, we developed and carried out tactile ideation workshops with participants in the UK and India. Our participants generated a large and diverse set of ideas, most focusing on ways to meet needs related to social interaction. In some cases, this was a matter of recognizing people. In other cases, they wanted to be able to participate in social situations without foregrounding their disability. It was striking that this finding was consistent across the UK and India despite substantial cultural and infrastructural differences. In this paper, we describe a new technique for working with people with visual disabilities to imagine new technologies that are tuned to their needs and aspirations. Based on our experience with these workshops, we provide a set of social dimensions to consider in the design of new AI technologies: social participation, social navigation, social maintenance, and social independence. We offer these social dimensions as a starting point for foregrounding users’ social needs and desires as a more deliberate consideration in assistive technology design.