{"id":987693,"date":"2023-12-04T06:00:37","date_gmt":"2023-12-04T14:00:37","guid":{"rendered":""},"modified":"2023-12-04T06:00:38","modified_gmt":"2023-12-04T14:00:38","slug":"tackling-sign-language-data-inequity","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/tackling-sign-language-data-inequity\/","title":{"rendered":"Tackling sign language data inequity"},"content":{"rendered":"\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1.png\" alt=\"Blue to green gradient. Two rows of hands: the top row signing ASL and the bottom row signing Data.\" class=\"wp-image-988077\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1.png 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-1024x576.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-768x432.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-655x368.png 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-343x193.png 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-240x135.png 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-640x360.png 640w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-960x540.png 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-1280x720.png 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n\n<p>Access to information is considered a human right by many global organizations and governments. But even though at least 71 countries mandate the provision of services in sign language, most information resources (like search engines or news sites) are presented in written language only. Sign languages are the primary means of communication for about 70 million d\/Deaf people worldwide, and are also used by hearing family members, friends, and colleagues.<\/p>\n\n\n\n<p>While over 300 sign languages are in use worldwide, American Sign Language (ASL) is the primary sign language used in the United States. For many deaf people, English and other written languages are secondary languages. Requiring signing deaf people to navigate information in a written language like English forces them to operate in a different, and potentially non-fluent, language. Adapting text resources for sign language input and output introduces significant technical challenges. Automatically recognizing or translating sign language could help expand access, but AI development has been blocked by a lack of high-quality data.<\/p>\n\n\n\n<p>To help make technical systems more accessible to people with disabilities, Danielle Bragg, a senior researcher at Microsoft Research, has been leading <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/data-driven-accessibility-systems\/overview\/\" target=\"_blank\" rel=\"noreferrer noopener\">efforts to build systems<\/a> that better support sign language. 
This blog post provides an update on their progress, with a focus on their recent paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/asl-citizen-a-community-sourced-dataset-for-advancing-isolated-sign-language-recognition\/\" target=\"_blank\" rel=\"noreferrer noopener\">ASL Citizen: A Community-Sourced Dataset for Advancing Isolated Sign Language Recognition,<\/a> which introduces the first crowdsourced sign language dataset. Advancing the state of the art in sign recognition, the project demonstrates that <strong>community-centered data curation<\/strong> is not only the right thing to do, but also advances machine learning.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"limitations-of-prior-datasets\">Limitations of prior datasets<\/h2>\n\n\n\n<p>ASL Citizen supports machine learning methods to overcome limitations of prior Isolated Sign Language Recognition (ISLR) datasets. Model development typically requires a large, high-quality training set (i.e., a large vocabulary, minimal label noise, and representation of diverse signers and environments). A lack of appropriate sign language data collected with consent has been a major barrier to the development of real-world sign language systems.<\/p>\n\n\n\n<p>Prior sign language datasets have been collected in two main ways: 1) by scraping the internet for videos or 2) by inviting people to a lab for recording. While scraping can result in large collections, the videos are typically collected without consent from video creators, and scraping violates many websites\u2019 terms of service; it is also difficult to identify and label the content in scraped videos. On the other hand, lab collections typically come with written consent from participants, but they are generally small, limited by the human hours required to record participants and the small pool of potential contributors located nearby. Lab collections also fail to capture diverse real-world settings. 
To enable real-world sign language AI, sign language datasets need to additionally capture real-world settings, include diverse people, and be accurately labelled.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"designing-the-asl-citizen-collection\">Designing the ASL Citizen collection<\/h2>\n\n\n\n<p>To overcome the limitations of past datasets, the research team designed a novel sign language crowdsourcing platform. The platform was web-based and enabled people who wanted to contribute to log in, engage in consent, and record videos. Web collection opened the project to a larger, more diverse audience, including anyone with internet access. It also enabled capturing everyday environments, which real-world systems need to learn to handle. By enabling people to contribute wherever and whenever they want, but still providing an explicit consent process, crowdsourcing enables scale with consent.<\/p>\n\n\n\n<p>The platform design also solved labeling problems. Labeling challenges have limited past dataset size due to the large amount of time required to identify video contents. For example, for a dataset of sign language monologues, each video must be carefully watched, and the signed contents must be identified, marked down, and time-aligned with the video. 
Only highly skilled experts can annotate sign language videos, and the process often takes more time and resources than the video collection itself. To completely avoid such labelling overhead, the platform was designed to collect pre-labelled contents. It accomplishes this by prompting users with sign videos that have known contents and asking them to record their own version. This enabled the team to automatically label users\u2019 videos with the known contents of the corresponding prompt videos.<\/p>\n\n\n\n<p>The platform was community-focused in multiple ways. All website content was presented in both English and ASL; the recording prompts were in ASL; the project goals were shared explicitly; participants were able to verify one another\u2019s contributions; and the community-sourced dataset was made available as a community resource in the form of a dictionary.<\/p>\n\n\n\n<p>In designing the platform, the team engaged in an iterative process, incorporating feedback from community stakeholders and testers, and ran a pilot study with the platform to better understand the user experience and quality of collected data. For more about the platform design and pilot study, see: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/exploring-collection-of-sign-language-videos-through-crowdsourcing\/\" target=\"_blank\" rel=\"noreferrer noopener\">Exploring Collection of Sign Language Videos through Crowdsourcing<\/a>. 
The team also experimented with crowdsourcing videos of complete sentences, which are required for more complete sign language modeling, as discussed in <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/asl-wiki-an-exploratory-interface-for-crowdsourcing-asl-translations\/\" target=\"_blank\" rel=\"noreferrer noopener\">ASL Wiki: An Exploratory Interface for Crowdsourcing ASL Translations<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"improving-models-using-asl-citizen\">Improving models using ASL Citizen<\/h2>\n\n\n\n<p>A diverse group of experts helped bring ASL Citizen to life. Engineers and a designer at Microsoft helped build and scale the platform design; a well-known ASL professional recorded the prompt videos; and collaborators at <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.bu.edu\/academics\/wheelock\/programs\/deaf-studies\/\" target=\"_blank\" rel=\"noopener noreferrer\">Boston University\u2019s Deaf Center<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> provided feedback and managed participant recruitment and engagement. Deaf research team members were involved throughout. Consisting of about 84,000 videos of 2,700 distinct signs from ASL, the resulting dataset is the largest labelled ISLR dataset and the first crowdsourced ISLR dataset.<\/p>\n\n\n\n<p>Using the new dataset, the researchers adapted previous approaches to ISLR to the real-world task of looking up signs in a dictionary, and released a set of baselines for machine learning researchers to build upon, focusing on supervised deep learning methods. To establish baseline models, the team partnered with collaborators from the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.cs.washington.edu\/\" target=\"_blank\" rel=\"noopener noreferrer\">Paul G. 
Allen School of Computer Science and Engineering<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> at the University of Washington. Comparison to prior datasets was difficult because each dataset consists of a different vocabulary. However, when evaluation is restricted to the vocabulary that overlaps with the best prior dataset, one baseline model improves from 16% to 71% accuracy when trained on the new dataset. Even without algorithmic advances, training and testing on ASL Citizen improves ISLR accuracy compared to prior work, despite spanning a larger vocabulary and testing on completely unseen users.<\/p>\n\n\n\n<div class=\"wp-block-group is-layout-flow wp-block-group-is-layout-flow\">\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Project<\/span>\n\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/asl-citizen\/\" data-bi-cN=\"ASL Citizen\" data-external-link=\"false\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>ASL Citizen<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-chevron-right\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n\n\n\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Publication<\/span>\n\t\t\t<a 
href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/asl-citizen-a-community-sourced-dataset-for-advancing-isolated-sign-language-recognition\/\" data-bi-cN=\"ASL Citizen: A Community-Sourced Dataset for Advancing Isolated Sign Language Recognition\" data-external-link=\"false\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>ASL Citizen: A Community-Sourced Dataset for Advancing Isolated Sign Language Recognition<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-chevron-right\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Tool<\/span>\n\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/download\/details.aspx?id=105253\" data-bi-cN=\"ASL Citizen dataset\" target=\"_blank\" rel=\"noopener noreferrer\" data-external-link=\"true\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>ASL Citizen dataset<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-open-in-new-tab\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n\n\n\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Tool<\/span>\n\t\t\t<a href=\"https:\/\/github.com\/microsoft\/ASL-citizen-code\" data-bi-cN=\"ASL Citizen baseline code\" target=\"_blank\" 
rel=\"noopener noreferrer\" data-external-link=\"true\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>ASL Citizen baseline code<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-open-in-new-tab\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"building-on-past-efforts\">Building on past efforts<\/h2>\n\n\n\n<p>The ASL Citizen project is part of Microsoft\u2019s mission of empowerment and societal impact, and of Bragg\u2019s focus on advancing sign language technology. Doing this effectively requires human-centered, interdisciplinary work. Because sign language is central to Deaf culture and identity, developing successful sign language AI requires not only technical work, but also deep understanding of the community and alignment of technology design with their perspectives.<\/p>\n\n\n\n<p>Prior work informing the ASL Citizen dataset included a <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/sign-language-recognition-generation-and-translation-an-interdisciplinary-perspective\/\" target=\"_blank\" rel=\"noreferrer noopener\">workshop on sign language AI<\/a> at Microsoft in 2019, which convened diverse thought leaders from academia, industry, and the Deaf community to discuss the state of sign language computation. The resulting ASSETS 2019 Best Paper, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/sign-language-recognition-generation-and-translation-an-interdisciplinary-perspective\/\" target=\"_blank\" rel=\"noreferrer noopener\">Sign Language Recognition, Generation, and Translation: An Interdisciplinary Perspective<\/a>, outlined the state of the art, the field\u2019s biggest challenges, and calls to action. 
This work helped establish data challenges as a major limitation, and highlighted the importance of Deaf community involvement.<\/p>\n\n\n\n<p>Aware of sensitivities to sign language technology development, Bragg and colleagues also mapped out the ethics of sign language AI datasets in a 2021 paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/the-fate-landscape-of-sign-language-ai-datasets-an-interdisciplinary-perspective\/\" target=\"_blank\" rel=\"noreferrer noopener\">The FATE Landscape of Sign Language AI Datasets: An Interdisciplinary Perspective<\/a>. This paper establishes the impact of data choices on models and users, and discusses other complex issues, including data ownership, data sharing, and transparency around sign language AI. To help address such issues, Bragg and collaborators experimented with disguising people\u2019s faces in sign language videos and examined the impact on model performance in a 2020 paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/exploring-collection-of-sign-language-datasets-privacy-participation-and-model-performance\/\" target=\"_blank\" rel=\"noreferrer noopener\">Exploring Collection of Sign Language Datasets: Privacy, Participation, and Model Performance<\/a>. Designing sign language data collections to maximize benefits while minimizing harms is hard, and design decisions involve tradeoffs.<\/p>\n\n\n\n<p>To better understand the eventual use cases for sign language AI built using datasets like ASL Citizen, Bragg and collaborators studied community perspectives on sign language AI in another 2023 paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/u-s-deaf-community-perspectives-on-automatic-sign-language-translation\/\" target=\"_blank\" rel=\"noreferrer noopener\">U.S. 
Deaf Community Perspectives on Automatic Sign Language Translation<\/a>, which outlines a survey of Deaf community perspectives on ASL translation.<\/p>\n\n\n\n<p>As large language models and deep learning continue to develop, Bragg expects that high-quality representative training data will become increasingly essential. She hopes that the team\u2019s work can serve as an example of how engaging with communities can also help to advance ML.<\/p>\n\n\n\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/podcast\/accessible-systems-for-sign-language-computation-with-dr-danielle-bragg\/\" target=\"_self\" aria-label=\"Accessible systems for sign language computation with Dr. Danielle Bragg\" data-bi-type=\"annotated-link\" data-bi-cN=\"Accessible systems for sign language computation with Dr. Danielle Bragg\" class=\"annotations__list-thumbnail\" >\n\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"170\" height=\"96\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-300x169.png\" class=\"mb-2\" alt=\"head shot of Danielle Bragg for the Microsoft Research Podcast\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-1024x577.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-768x432.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-1536x865.png 1536w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-2048x1153.png 2048w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-655x368.png 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-343x193.png 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-640x360.png 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-960x540.png 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-1280x720.png 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/06\/DanielleBragg_podcast_1400x788_No-logos-1920x1080.png 1920w\" sizes=\"auto, (max-width: 170px) 100vw, 170px\" \/>\t\t\t\t<\/a>\n\t\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Podcast<\/span>\n\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/podcast\/accessible-systems-for-sign-language-computation-with-dr-danielle-bragg\/\" data-bi-cN=\"Accessible systems for sign language computation with Dr. Danielle Bragg\" data-external-link=\"false\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>Accessible systems for sign language computation with Dr. 
Danielle Bragg<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-chevron-right\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>ASL Citizen is the first crowdsourced sign language dataset, advancing the state of the art in sign recognition. The web-based project captured input from people in real-world settings, and from a diverse group of experts, including Deaf team members.<\/p>\n","protected":false},"author":42735,"featured_media":988077,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":[],"msr_hide_image_in_river":0,"footnotes":""},"categories":[1],"tags":[],"research-area":[13556,13545,13554,13558],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[243984],"msr-impact-theme":[261667],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-987693","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research-blog","msr-research-area-artificial-intelligence","msr-research-area-human-language-technologies","msr-research-area-human-computer-interaction","msr-research-area-security-privacy-cryptography","msr-locale-en_us","msr-post-option-blog-homepage-featured"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[199563,199571],"msr_impact_theme":["Empowerment"],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[],"related-projects":[614286,932478],"related-events":[],"related-researchers":[],"msr_type":"Post","featured_image_thumbnail":"<img width=\"960\" height=\"540\" 
src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-960x540.png\" class=\"img-object-cover\" alt=\"Blue to green gradient. Two rows of hands: the top row signing ASL and the bottom row signing Data.\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-960x540.png 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-1024x576.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-768x432.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-655x368.png 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-343x193.png 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-240x135.png 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-640x360.png 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1-1280x720.png 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2023\/11\/SLD-BlogHeroFeature-1400x788-1.png 1400w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"","formattedDate":"December 4, 2023","formattedExcerpt":"ASL Citizen is the first crowdsourced sign language dataset, advancing the state of the art in sign recognition. 
The web-based project captured input from people in real-world settings, and from a diverse group of experts, including Deaf team members.","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/987693","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/42735"}],"replies":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/comments?post=987693"}],"version-history":[{"count":24,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/987693\/revisions"}],"predecessor-version":[{"id":989010,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/987693\/revisions\/989010"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/988077"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=987693"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/categories?post=987693"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/tags?post=987693"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=987693"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=987693"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?po
st=987693"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=987693"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=987693"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=987693"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=987693"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=987693"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}