{"id":694200,"date":"2020-09-22T16:12:17","date_gmt":"2020-09-22T23:12:17","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&#038;p=694200"},"modified":"2020-09-22T16:24:53","modified_gmt":"2020-09-22T23:24:53","slug":"assessing-and-mitigating-unfairness-in-credit-models-with-the-fairlearn-toolkit","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/assessing-and-mitigating-unfairness-in-credit-models-with-the-fairlearn-toolkit\/","title":{"rendered":"Assessing and mitigating unfairness in credit models with the Fairlearn toolkit"},"content":{"rendered":"<p>As AI plays an increasing role in the financial services industry, it is essential that financial services organizations anticipate and mitigate unintended consequences, including fairness-related harms, such as denying people services, initiating predatory lending, amplifying gender or racial biases, or violating laws such as the United States\u2019 <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/www.govinfo.gov\/content\/pkg\/USCODE-2011-title15\/html\/USCODE-2011-title15-chap41-subchapIV.htm\">Equal Credit Opportunity Act<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> (ECOA). To address these kinds of harms, fairness must be explicitly prioritized throughout the AI development and deployment lifecycle.<\/p>\n<p>To help organizations prioritizing fairness in AI systems, Microsoft has released an open-source toolkit called <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/fairlearn.github.io\/\">Fairlearn<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. This toolkit\u202ffocuses\u202fon the assessment and mitigation of fairness-related harms that affect\u202fgroups of people, such as those defined in terms of race, sex, age, or disability status.<\/p>\n<p>Using a dataset of loan applications, we illustrate how a machine learning model trained with standard algorithms can lead to unfairness in a loan adjudication scenario, and how Fairlearn can be used to assess and mitigate this unfairness. The model, which is obtained by thresholding the predictions of probability of default (PD), leads to an uneven distribution of adverse events for the \u201cmale\u201d group compared to the \u201cfemale\u201d group even though this model does not use sex as one of its inputs. Fairlearn\u2019s mitigation algorithms reduce this disparity from 8 percentage points to 1 percentage point without any (statistically significant) impact on the to the financial services organization.<\/p>\n<p>We emphasize that fairness in AI is a sociotechnical challenge, so no software toolkit will \u201csolve\u201d fairness in all AI systems. 
However, software toolkits like Fairlearn can still play a valuable role in developing fairer AI systems, provided they are precise and targeted, embedded within a holistic risk management framework, and supplemented with additional resources and processes.
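To make the workflow described above concrete, here is a minimal sketch of the assess-then-mitigate loop using Fairlearn's public API: MetricFrame for the assessment step and ThresholdOptimizer for a post-processing mitigation step. The loan dataset, column names, and baseline classifier are hypothetical placeholders rather than the setup used in the whitepaper, and demographic parity is only one of several fairness criteria Fairlearn supports.

```python
# Sketch of a Fairlearn assess-then-mitigate loop on a hypothetical
# loan-application dataset. File name, columns, and model are placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from fairlearn.metrics import MetricFrame, false_positive_rate, selection_rate
from fairlearn.postprocessing import ThresholdOptimizer

# Hypothetical data: X holds applicant features (sex is deliberately
# excluded from the model inputs), y is 1 if the applicant defaulted,
# and sex is kept separately as the sensitive feature for assessment.
data = pd.read_csv("loan_applications.csv")   # placeholder path
y = data["defaulted"]
sex = data["sex"]
X = data.drop(columns=["defaulted", "sex"])

X_train, X_test, y_train, y_test, sex_train, sex_test = train_test_split(
    X, y, sex, test_size=0.3, random_state=0)

# Baseline: a standard classifier whose thresholded PD predictions
# drive the loan adjudication decision.
model = GradientBoostingClassifier().fit(X_train, y_train)
baseline_pred = model.predict(X_test)

# Assessment: compare rates of adverse outcomes across sex groups.
mf = MetricFrame(
    metrics={"selection_rate": selection_rate,
             "false_positive_rate": false_positive_rate},
    y_true=y_test, y_pred=baseline_pred, sensitive_features=sex_test)
print(mf.by_group)       # per-group rates
print(mf.difference())   # gap between groups for each metric

# Mitigation: post-process the trained model with group-specific
# thresholds so the chosen fairness constraint is approximately satisfied.
mitigator = ThresholdOptimizer(estimator=model,
                               constraints="demographic_parity",
                               prefit=True)
mitigator.fit(X_train, y_train, sensitive_features=sex_train)
mitigated_pred = mitigator.predict(X_test, sensitive_features=sex_test,
                                   random_state=0)

# Re-assess after mitigation to see how much the disparity shrank.
mf_mitigated = MetricFrame(
    metrics={"selection_rate": selection_rate,
             "false_positive_rate": false_positive_rate},
    y_true=y_test, y_pred=mitigated_pred, sensitive_features=sex_test)
print(mf_mitigated.difference())
```

An alternative to post-processing is Fairlearn's reductions approach (for example, ExponentiatedGradient in fairlearn.reductions), which retrains the underlying model subject to a fairness constraint instead of adjusting decision thresholds per group.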
Authors: Miro Dudík, William Chen, Solon Barocas, Mario Inchiosa, Nick Lewins, Miruna Oprescu, Joy Qiao, Mehrnoosh Sameki, Mario Schlener, Jason Tuo, Hanna Wallach

Technical report MSR-TR-2020-34, Microsoft, September 22, 2020.
PDF: https://www.microsoft.com/en-us/research/wp-content/uploads/2020/09/Fairlearn-EY_WhitePaper-2020-09-22.pdf
Wallach"}],"msr_impact_theme":[],"msr_research_lab":[],"msr_event":[],"msr_group":[372368],"msr_project":[],"publication":[],"video":[],"msr-tool":[],"msr_publication_type":"techreport","related_content":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/694200","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":2,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/694200\/revisions"}],"predecessor-version":[{"id":694218,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/694200\/revisions\/694218"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/660288"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=694200"}],"wp:term":[{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=694200"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=694200"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=694200"},{"taxonomy":"msr-publisher","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publisher?post=694200"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=694200"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=694200"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=694200"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=694200"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=694200"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=694200"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=694200"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=694200"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}