{"id":372368,"date":"2017-03-20T11:42:31","date_gmt":"2017-03-20T18:42:31","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-group&#038;p=372368"},"modified":"2025-11-20T11:25:32","modified_gmt":"2025-11-20T19:25:32","slug":"fate","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/theme\/fate\/","title":{"rendered":"FATE: Fairness, Accountability, Transparency & Ethics in AI"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background-catalina-blue card-background--full-bleed\">\n\t\t\t\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div class=\"card-foreground d-flex mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div class=\"container d-flex mt-md-n5 my-lg-5 align-self-center\">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div class=\"card-body \">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/lab\/microsoft-research-new-york\/\" class=\"icon-link icon-link--reverse mb-2\" data-bi-cN=\"Microsoft Research Lab \u2013 New York City\">\n\t\t\t\t\t\t\t\t\t<span class=\"c-glyph glyph-chevron-left\" aria-hidden=\"true\"><\/span>\n\t\t\t\t\t\t\t\t\tMicrosoft Research Lab \u2013 New York City\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 class=\"wp-block-heading h2\" id=\"fate-fairness-accountability-transparency-and-ethics-in-ai\">FATE: Fairness, Accountability, Transparency, and Ethics in AI<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<figure class=\"wp-block-image alignright wp-image-596905 is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"178\" 
src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/03\/FATEpicture-300x178.png\" alt=\"FATE and friends group photo\" class=\"wp-image-596905\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/03\/FATEpicture-300x178.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/03\/FATEpicture-768x455.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/03\/FATEpicture-1024x606.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/03\/FATEpicture.png 1149w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><figcaption class=\"wp-element-caption\">FATE and friends<\/figcaption><\/figure>\n\n\n\n<p class=\"has-text-align-left\">We study the complex societal implications of artificial intelligence (AI), machine learning (ML), and natural language processing (NLP). Our aim is to develop computational techniques that are both innovative and responsible, prioritizing issues of fairness, accountability, transparency, and ethics as they relate to AI, ML, and NLP. To do so, we draw on fields with a sociotechnical orientation, such as HCI, information science, sociology, anthropology, science and technology studies, media studies, political science, and law.<\/p>\n\n\n\n<p class=\"has-text-align-left\">We work closely with other research groups within Microsoft, such as the <a href=\"https:\/\/www.microsoft.com\/research\/group\/stac-sociotechnical-alignment-center\/\">Sociotechnical Alignment Center<\/a> and the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/socialmediacollective.org\/\" target=\"_blank\" rel=\"noopener noreferrer\">Social Media Collective<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. 
We <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/theme\/fate\/#!publications\">publish our work<\/a> in a variety of academic and other venues.<\/p>\n\n\n\n<p><strong>NEWS: <\/strong>We are hiring <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/apply.careers.microsoft.com\/careers\/job\/1970393556627874\">interns<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> and <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/apply.careers.microsoft.com\/careers\/job?pid=1970393556626636\">postdocs<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> to start in summer 2026! Apply by <strong>December 15, 2025<\/strong> for full consideration.<\/p>\n\n\n","protected":false},"excerpt":{"rendered":"<p>We study the complex societal implications of artificial intelligence (AI), machine learning (ML), and natural language processing (NLP). 
Our aim is to develop computational techniques that are both innovative and responsible, prioritizing issues of fairness, accountability, transparency, and ethics as they relate to AI, ML, and NLP. To do so, we draw on fields with a sociotechnical orientation, such as HCI, information science, sociology, anthropology, science and technology studies, media studies, political science, and law.<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_group_start":"","footnotes":""},"research-area":[13556,13554,13559],"msr-group-type":[243688],"msr-locale":[268875],"msr-impact-theme":[],"class_list":["post-372368","msr-group","type-msr-group","status-publish","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-research-area-social-sciences","msr-group-type-theme","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[199571,437514],"related-researchers":[{"type":"user_nicename","display_name":"Agathe Balayn","user_id":43641,"people_section":"Group 1","alias":"balaynagathe"},{"type":"user_nicename","display_name":"Solon Barocas","user_id":36051,"people_section":"Group 1","alias":"solon"},{"type":"user_nicename","display_name":"Kate Crawford","user_id":32494,"people_section":"Group 1","alias":"kate"},{"type":"user_nicename","display_name":"Miro Dud\u00edk","user_id":32867,"people_section":"Group 1","alias":"mdudik"},{"type":"user_nicename","display_name":"Darya Moldavskaya","user_id":43569,"people_section":"Group 1","alias":"dmoldavskaya"},{"type":"user_nicename","display_name":"Matthew Vogel","user_id":43560,"people_section":"Group 1","alias":"mavoge"},{"type":"user_nicename","display_name":"Hanna Wallach","user_id":34779,"people_section":"Group 
1","alias":"wallach"},{"type":"user_nicename","display_name":"Jenn Wortman Vaughan","user_id":32235,"people_section":"Group 1","alias":"jenn"}],"related-publications":[579202,563964,1081281,896697,1049067,1049058,1049034,1030986,1030965,999801,996486,996246,964446,951228,950433,934374,923319,923313,920544,919683,919101,919095,919089,912675,904230,894141,1134889,1161857,1147541,1147535,1147506,1147483,1147480,1147478,1147476,1144124,1138603,1136227,1108629,1134768,1134753,1126449,1115496,1115490,1115484,1114086,1114080,1114071,1113531,1113519,647091,687096,687048,684486,683982,683943,663921,663915,661503,660249,654621,651078,647379,687102,647082,645714,644547,644349,625320,620094,600966,594775,559506,559500,559494,815431,876204,863979,863790,854574,851563,851524,851488,840211,833854,817582,815839,559482,766810,763549,763528,750562,749203,748450,726181,710506,694200,688662,687108],"related-downloads":[],"related-videos":[665562,736096,747361,748126,1003980],"related-projects":[716050,721693,721708],"related-events":[741676,729871,728776,632442,632154],"related-opportunities":[1156240,1162988],"related-posts":[469611,491609,583696,563847,572,494648,640473,659955,680358,965166,969147,1032900,1156686],"tab-content":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":40,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368\/revisions"}],"predecessor-version":[{"id":1156241,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368\/revisions\/1156241"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=372368"}],"wp:ter
m":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=372368"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=372368"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=372368"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=372368"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}