{"id":716050,"date":"2021-02-19T09:28:53","date_gmt":"2021-02-19T17:28:53","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=716050"},"modified":"2022-02-06T20:37:21","modified_gmt":"2022-02-07T04:37:21","slug":"ai-fairness-checklist","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/ai-fairness-checklist\/","title":{"rendered":"AI Fairness Checklist"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background- card-background--full-bleed\">\n\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"2089\" height=\"720\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2.jpg\" class=\"attachment-full size-full\" alt=\"illustration of algorithms and checkboxes on blue background\" style=\"\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2.jpg 2089w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2-300x103.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2-1024x353.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2-768x265.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2-1536x529.jpg 1536w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2-2048x706.jpg 2048w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/01\/AI-Fairness-Checklist-v2-16x6.jpg 16w\" sizes=\"auto, (max-width: 2089px) 100vw, 2089px\" \/>\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div class=\"card-foreground d-flex mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div 
class=\"container d-flex mt-md-n5 my-lg-5 align-self-center\">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div class=\"card-body \">\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 id=\"ai-fairness-checklist\" class=\"h2\">AI Fairness Checklist<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<p>Many organizations have published principles to guide the responsible development and deployment of AI systems, but it is largely left to practitioners to put these principles into practice. In response, other organizations have produced AI ethics checklists, including checklists for specific concepts, such as fairness.<\/p>\n\n\n\n<p>Checklists in other domains, such as aviation, medicine, and structural engineering, have had well-documented success in saving lives and improving professional practices. But unless checklists are grounded in practitioners\u2019 needs, they may be misused or ignored.<\/p>\n\n\n\n<p>The fairness checklist research project explores how checklists may be designed to support the development of fairer AI products and services. 
To do this, we work with the AI practitioners whom the checklists are intended to support, soliciting their input on checklist design and supporting the adoption and integration of the checklist into AI design, development, and deployment lifecycles.<\/p>\n\n\n\n<p>Our first studies in this project have led to a fairness checklist co-designed with practitioners, as well as insights into how organizational and team processes shape how AI teams address fairness harms.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--1\"><a data-bi-type=\"button\" class=\"wp-block-button__link\" href=\"https:\/\/query.prod.cms.rt.microsoft.com\/cms\/api\/am\/binary\/RE4t6dA\" target=\"_blank\" rel=\"noreferrer noopener\">Get the latest checklist<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-outline is-style-outline--2\"><a data-bi-type=\"button\" class=\"wp-block-button__link\" href=\"https:\/\/www.microsoft.com\/en-us\/ai\/responsible-ai-resources\" target=\"_blank\" rel=\"noreferrer noopener\">More Responsible AI resources<\/a><\/div>\n<\/div>\n\n\n","protected":false},"excerpt":{"rendered":"<p>The AI Fairness Checklist project&#8217;s goal is to conduct an iterative co-design process with 48 AI practitioners across 12 technology companies to understand the role of checklists in AI ethics, focusing on 
fairness.<\/p>\n","protected":false},"featured_media":716641,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13556],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-716050","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[644349,1030986],"related-downloads":[],"related-videos":[],"related-groups":[372368],"related-events":[],"related-opportunities":[],"related-posts":[1032900],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Hanna Wallach","user_id":34779,"people_section":"Section name 0","alias":"wallach"},{"type":"user_nicename","display_name":"Jenn Wortman Vaughan","user_id":32235,"people_section":"Section name 
0","alias":"jenn"}],"msr_research_lab":[199571],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/716050","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":8,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/716050\/revisions"}],"predecessor-version":[{"id":818656,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/716050\/revisions\/818656"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/716641"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=716050"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=716050"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=716050"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=716050"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=716050"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}