{"id":1073550,"date":"2024-08-15T17:29:59","date_gmt":"2024-08-16T00:29:59","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=1073550"},"modified":"2024-08-22T06:18:28","modified_gmt":"2024-08-22T13:18:28","slug":"abstracts-august-15-2024","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/podcast\/abstracts-august-15-2024\/","title":{"rendered":"Abstracts: August 15, 2024"},"content":{"rendered":"\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788.jpg\" alt=\"Microsoft Research Podcast - Abstracts\" class=\"wp-image-1057803\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-640x360.jpg 640w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/Episode-12_Abstracts_Hero_Feature_No_Text_1400x788-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n<div class=\"wp-block-msr-podcast-container my-4\">\n\t<iframe loading=\"lazy\" src=\"https:\/\/player.blubrry.com\/?podcast_id=134878498&modern=1\" class=\"podcast-player\" frameborder=\"0\" height=\"164px\" width=\"100%\" scrolling=\"no\" title=\"Podcast Player\"><\/iframe>\n<\/div>\n\n\n\n<p>Members of the research community at Microsoft work continuously to advance their respective fields. <em>Abstracts<\/em> brings its audience to the cutting edge with them through short, compelling conversations about new and noteworthy achievements.<\/p>\n\n\n\n<p>In this episode, Microsoft Product Manager <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/shreyjain\/\">Shrey Jain<\/a> and OpenAI Research Scientist <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/www.zoehitzig.com\/\">Zo\u00eb Hitzig<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> join host Amber Tingle to discuss \u201c<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/personhood-credentials-artificial-intelligence-and-the-value-of-privacy-preserving-tools-to-distinguish-who-is-real-online\/\">Personhood credentials: Artificial intelligence and the value of privacy-preserving tools to distinguish who is real online<\/a>.\u201d In their paper, Jain, Hitzig, and their coauthors describe how malicious actors can draw on increasingly advanced AI tools to carry out deception, making online deception harder to detect and more harmful. 
Bringing ideas from cryptography into AI policy conversations, they identify a possible mitigation: a credential that allows its holder to prove they\u2019re a person\u2013\u2013not a bot\u2013\u2013without sharing any identifying information. This exploratory research reflects a broad range of collaborators from across industry, academia, and the civil sector specializing in areas such as security, digital identity, advocacy, and policy.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/personhood-credentials-artificial-intelligence-and-the-value-of-privacy-preserving-tools-to-distinguish-who-is-real-online\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\n\n<section class=\"wp-block-msr-subscribe-to-podcast subscribe-to-podcast\">\n\t<div class=\"subscribe-to-podcast__inner border-top border-bottom border-width-2\">\n\t\t<h2 class=\"h5 subscribe-to-podcast__heading\">\n\t\t\tSubscribe to the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/podcast\">Microsoft Research Podcast<\/a>:\t\t<\/h2>\n\t\t<ul class=\"subscribe-to-podcast__list list-unstyled\">\n\t\t\t\t\t\t\t<li class=\"subscribe-to-podcast__list-item\">\n\t\t\t\t\t<a class=\"subscribe-to-podcast__link\" href=\"https:\/\/itunes.apple.com\/us\/podcast\/microsoft-research-a-podcast\/id1318021537?mt=2\" target=\"_blank\" rel=\"noreferrer noopener\">\n\t\t\t\t\t\t<svg class=\"subscribe-to-podcast__svg\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" fill=\"black\" viewBox=\"0 0 32 32\">  <path d=\"M7.12 0c-3.937-0.011-7.131 3.183-7.12 7.12v17.76c-0.011 3.937 3.183 7.131 7.12 7.12h17.76c3.937 0.011 7.131-3.183 7.12-7.12v-17.76c0.011-3.937-3.183-7.131-7.12-7.12zM15.817 3.421c3.115 0 5.932 1.204 8.079 3.453 1.631 1.693 2.547 3.489 3.016 5.855 0.161 0.787 0.161 2.932 0.009 3.817-0.5 2.817-2.041 
5.339-4.317 7.063-0.812 0.615-2.797 1.683-3.115 1.683-0.12 0-0.129-0.12-0.077-0.615 0.099-0.792 0.192-0.953 0.64-1.141 0.713-0.296 1.932-1.167 2.677-1.911 1.301-1.303 2.229-2.932 2.677-4.719 0.281-1.1 0.244-3.543-0.063-4.672-0.969-3.595-3.907-6.385-7.5-7.136-1.041-0.213-2.943-0.213-4 0-3.636 0.751-6.647 3.683-7.563 7.371-0.245 1.004-0.245 3.448 0 4.448 0.609 2.443 2.188 4.681 4.255 6.015 0.407 0.271 0.896 0.547 1.1 0.631 0.447 0.192 0.547 0.355 0.629 1.14 0.052 0.485 0.041 0.62-0.072 0.62-0.073 0-0.62-0.235-1.199-0.511l-0.052-0.041c-3.297-1.62-5.407-4.364-6.177-8.016-0.187-0.943-0.224-3.187-0.036-4.052 0.479-2.323 1.396-4.135 2.921-5.739 2.199-2.319 5.027-3.543 8.172-3.543zM16 7.172c0.541 0.005 1.068 0.052 1.473 0.14 3.715 0.828 6.344 4.543 5.833 8.229-0.203 1.489-0.713 2.709-1.619 3.844-0.448 0.573-1.537 1.532-1.729 1.532-0.032 0-0.063-0.365-0.063-0.803v-0.808l0.552-0.661c2.093-2.505 1.943-6.005-0.339-8.296-0.885-0.896-1.912-1.423-3.235-1.661-0.853-0.161-1.031-0.161-1.927-0.011-1.364 0.219-2.417 0.744-3.355 1.672-2.291 2.271-2.443 5.791-0.348 8.296l0.552 0.661v0.813c0 0.448-0.037 0.807-0.084 0.807-0.036 0-0.349-0.213-0.683-0.479l-0.047-0.016c-1.109-0.885-2.088-2.453-2.495-3.995-0.244-0.932-0.244-2.697 0.011-3.625 0.672-2.505 2.521-4.448 5.079-5.359 0.547-0.193 1.509-0.297 2.416-0.281zM15.823 11.156c0.417 0 0.828 0.084 1.131 0.24 0.645 0.339 1.183 0.989 1.385 1.677 0.62 2.104-1.609 3.948-3.631 3.005h-0.015c-0.953-0.443-1.464-1.276-1.475-2.36 0-0.979 0.541-1.828 1.484-2.328 0.297-0.156 0.709-0.235 1.125-0.235zM15.812 17.464c1.319-0.005 2.271 0.463 2.625 1.291 0.265 0.62 0.167 2.573-0.292 5.735-0.307 2.208-0.479 2.765-0.905 3.141-0.589 0.52-1.417 0.667-2.209 0.385h-0.004c-0.953-0.344-1.157-0.808-1.553-3.527-0.452-3.161-0.552-5.115-0.285-5.735 0.348-0.823 1.296-1.285 2.624-1.291z\"\/><\/svg>\n\t\t\t\t\t\t<span class=\"subscribe-to-podcast__link-text\">Apple Podcasts<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/li>\n\t\t\t\n\t\t\t\t\t\t\t<li 
class=\"subscribe-to-podcast__list-item\">\n\t\t\t\t\t<a class=\"subscribe-to-podcast__link\" href=\"https:\/\/subscribebyemail.com\/www.blubrry.com\/feeds\/microsoftresearch.xml\" target=\"_blank\" rel=\"noreferrer noopener\">\n\t\t\t\t\t\t<svg class=\"subscribe-to-podcast__svg\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" fill=\"none\" viewBox=\"0 0 32 32\"><path fill=\"currentColor\" d=\"M6.4 6a2.392 2.392 0 00-2.372 2.119L16 15.6l11.972-7.481A2.392 2.392 0 0025.6 6H6.4zM4 10.502V22.8a2.4 2.4 0 002.4 2.4h19.2a2.4 2.4 0 002.4-2.4V10.502l-11.365 7.102a1.2 1.2 0 01-1.27 0L4 10.502z\"\/><\/svg>\n\t\t\t\t\t\t<span class=\"subscribe-to-podcast__link-text\">Email<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/li>\n\t\t\t\n\t\t\t\t\t\t\t<li class=\"subscribe-to-podcast__list-item\">\n\t\t\t\t\t<a class=\"subscribe-to-podcast__link\" href=\"https:\/\/subscribeonandroid.com\/www.blubrry.com\/feeds\/microsoftresearch.xml\" target=\"_blank\" rel=\"noreferrer noopener\">\n\t\t\t\t\t\t<svg class=\"subscribe-to-podcast__svg\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" fill=\"none\" viewBox=\"0 0 32 32\"><path fill=\"currentColor\" d=\"M12.414 4.02c-.062.012-.126.023-.18.06a.489.489 0 00-.12.675L13.149 6.3c-1.6.847-2.792 2.255-3.18 3.944h13.257c-.388-1.69-1.58-3.097-3.179-3.944l1.035-1.545a.489.489 0 00-.12-.675.492.492 0 00-.675.135l-1.14 1.68a7.423 7.423 0 00-2.55-.45c-.899 0-1.758.161-2.549.45l-1.14-1.68a.482.482 0 00-.494-.195zm1.545 3.824a.72.72 0 110 1.44.72.72 0 010-1.44zm5.278 0a.719.719 0 110 1.44.719.719 0 110-1.44zM8.44 11.204A1.44 1.44 0 007 12.644v6.718c0 .795.645 1.44 1.44 1.44.168 0 .33-.036.48-.09v-9.418a1.406 1.406 0 00-.48-.09zm1.44 0V21.76c0 .793.646 1.44 1.44 1.44h10.557c.793 0 1.44-.647 1.44-1.44V11.204H9.878zm14.876 0c-.169 0-.33.035-.48.09v9.418c.15.052.311.09.48.09a1.44 1.44 0 001.44-1.44v-6.719a1.44 1.44 0 00-1.44-1.44zM11.8 24.16v1.92a1.92 1.92 0 003.84 0v-1.92h-3.84zm5.759 0v1.92a1.92 1.92 0 003.84 0v-1.92h-3.84z\"\/><\/svg>\n\t\t\t\t\t\t<span 
class=\"subscribe-to-podcast__link-text\">Android<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/li>\n\t\t\t\n\t\t\t\t\t\t\t<li class=\"subscribe-to-podcast__list-item\">\n\t\t\t\t\t<a class=\"subscribe-to-podcast__link\" href=\"https:\/\/open.spotify.com\/show\/4ndjUXyL0hH1FXHgwIiTWU\" target=\"_blank\" rel=\"noreferrer noopener\">\n\t\t\t\t\t\t<svg class=\"subscribe-to-podcast__svg\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" fill=\"none\" viewBox=\"0 0 32 32\"><path fill=\"currentColor\" d=\"M16 4C9.383 4 4 9.383 4 16s5.383 12 12 12 12-5.383 12-12S22.617 4 16 4zm5.08 17.394a.781.781 0 01-1.086.217c-1.29-.86-3.477-1.434-5.303-1.434-1.937.002-3.389.477-3.403.482a.782.782 0 11-.494-1.484c.068-.023 1.71-.56 3.897-.562 1.826 0 4.365.492 6.171 1.696.36.24.457.725.217 1.085zm1.56-3.202a.895.895 0 01-1.234.286c-2.338-1.457-4.742-1.766-6.812-1.747-2.338.02-4.207.466-4.239.476a.895.895 0 11-.488-1.723c.145-.041 2.01-.5 4.564-.521 2.329-.02 5.23.318 7.923 1.995.419.26.547.814.286 1.234zm1.556-3.745a1.043 1.043 0 01-1.428.371c-2.725-1.6-6.039-1.94-8.339-1.942h-.033c-2.781 0-4.923.489-4.944.494a1.044 1.044 0 01-.474-2.031c.096-.023 2.385-.55 5.418-.55h.036c2.558.004 6.264.393 9.393 2.23.497.292.663.931.371 1.428z\"\/><\/svg>\n\t\t\t\t\t\t<span class=\"subscribe-to-podcast__link-text\">Spotify<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/li>\n\t\t\t\n\t\t\t\t\t\t\t<li class=\"subscribe-to-podcast__list-item\">\n\t\t\t\t\t<a class=\"subscribe-to-podcast__link\" href=\"https:\/\/www.blubrry.com\/feeds\/microsoftresearch.xml\" target=\"_blank\" rel=\"noreferrer noopener\">\n\t\t\t\t\t\t<svg class=\"subscribe-to-podcast__svg\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" fill=\"none\" viewBox=\"0 0 32 32\"><path fill=\"currentColor\" d=\"M6.667 4a2.676 2.676 0 00-2.612 2.13v.003c-.036.172-.055.35-.055.534v18.666c0 .183.019.362.055.534v.003a2.676 2.676 0 002.076 2.075h.002c.172.036.35.055.534.055h18.666A2.676 2.676 0 0028 25.333V6.667a2.676 2.676 0 00-2.13-2.612h-.003A2.623 2.623 0 0025.333 4H6.667zM8 
8h1.333C17.42 8 24 14.58 24 22.667V24h-2.667v-1.333c0-6.618-5.382-12-12-12H8V8zm0 5.333h1.333c5.146 0 9.334 4.188 9.334 9.334V24H16v-1.333A6.674 6.674 0 009.333 16H8v-2.667zM10 20a2 2 0 11-.001 4.001A2 2 0 0110 20z\"\/><\/svg>\n\t\t\t\t\t\t<span class=\"subscribe-to-podcast__link-text\">RSS Feed<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/li>\n\t\t\t\t\t<\/ul>\n\t<\/div>\n<\/section>\n\n\n<div class=\"wp-block-msr-show-more\">\n\t<div class=\"bg-neutral-100 p-5\">\n\t\t<div class=\"show-more-show-less\">\n\t\t\t<div>\n\t\t\t\t<span>\n\t\t\t\t\t\n\n<h2 class=\"wp-block-heading\" id=\"transcript\">Transcript<\/h2>\n\n\n\n<p>[MUSIC]<\/p>\n\n\n\n<p><strong>AMBER TINGLE: <\/strong>Welcome to <em>Abstracts<\/em>, a Microsoft Research Podcast that puts the spotlight on world-class research\u2014in brief. I&#8217;m Amber Tingle. In this series, members of the research community at Microsoft give us a quick snapshot\u2014or a <em>podcast abstract<\/em>\u2014of their new and noteworthy papers.<\/p>\n\n\n\n<p>[MUSIC FADES]<\/p>\n\n\n\n<p>Our guests today are Shrey Jain and Zo\u00eb Hitzig. Shrey is a product manager at Microsoft, and Zo\u00eb is a research scientist at OpenAI. They are two of the corresponding authors on a new paper, \u201cPersonhood credentials: Artificial intelligence and the value of privacy-preserving tools to distinguish who is real online.\u201d This exploratory research comprises multidisciplinary collaborators from across industry, academia, and the civil sector. The paper is available now on arXiv. Shrey and Zo\u00eb, thank you so much for joining us, and welcome back to the Microsoft Research Podcast.<\/p>\n\n\n\n\t\t\t\t<\/span>\n\t\t\t\t<span id=\"show-more-show-less-toggle-1\" class=\"show-more-show-less-toggleable-content\">\n\t\t\t\t\t\n\n\n\n<p><strong>SHREY JAIN:<\/strong> Thank you. 
We&#8217;re happy to be back.<\/p>\n\n\n\n<p><strong>ZO\u00cb HITZIG:<\/strong> Thanks so much.<\/p>\n\n\n\n<p><strong>TINGLE:<\/strong> Shrey, let&#8217;s start with a brief overview of your paper. Why is this research important, and why do you think this is something we should all know about?<\/p>\n\n\n\n<p><strong>JAIN: <\/strong>Malicious actors have been exploiting anonymity as a way to deceive others online. And historically, deception has been viewed as this unfortunate but necessary cost of preserving the internet&#8217;s commitment to privacy and unrestricted access to information. And today, AI is changing the way we should think about malicious actors&#8217; ability to be successful in those attacks. It makes it easier to create content that is indistinguishable from human-created content, and it is possible to do so in a way that is only getting cheaper and more accessible. And so this paper aims to offer a countermeasure to protect against AI-powered deception at scale while also protecting privacy. And I think there are two reasons why people should care about this problem. One is it can very soon become very logistically annoying to deal with these various different types of scams that can occur. I think we&#8217;ve all been susceptible to different types of attacks or scams that, you know, people have had. But now these scams are going to become much more persuasive and effective. And so for various different recovery purposes, it can become very challenging to get access back to your accounts or rebuild a reputation that someone may damage online. But more importantly, there are also very dangerous things that can happen. Kids might not be safe online anymore. Or consider our ability to communicate online for democratic processes. A lot of the way in which we shape political views today happens online. And that&#8217;s also at risk. And in response to that, we propose in this paper a solution titled <em>personhood credentials<\/em>. 
Personhood credentials enable people to prove that they are in fact a real person without revealing anything more about themselves online.<\/p>\n\n\n\n<p><strong>TINGLE:<\/strong> Zo\u00eb, walk us through what&#8217;s already been done in this field, and what&#8217;s your unique contribution to the literature here?<\/p>\n\n\n\n<p><strong>HITZIG:<\/strong> I see us as intervening on two separate bodies of work. And part of what we&#8217;re doing in this paper is bringing together those two bodies of work. There&#8217;s been absolutely amazing work for decades in cryptography and in security. And what cryptographers have been able to do is to figure out protocols that allow people to prove very specific claims about themselves without revealing their full identity. So when you think about walking into a bar and the bartender asks you to prove that you&#8217;re over 21\u2014or over 18, depending on where you are\u2014you typically have to show your full driver&#8217;s license. And now that&#8217;s revealing a lot of information. It says, you know, where you live, whether you&#8217;re an organ donor. It&#8217;s revealing a lot of information to that bartender. And online, we don&#8217;t know what different service providers are storing about us. So, you know, the bartender might not really care where we live or whether we&#8217;re an organ donor. But when we&#8217;re signing up for digital services and we have to show a highly revealing credential like a driver&#8217;s license just to get access to something, we&#8217;re giving over too much information in some sense. And so this one body of literature that we&#8217;re really drawing on is a literature in cryptography. The idea that I was talking about there, where you can prove privately just isolated claims about yourself, that&#8217;s an idea called an <em>anonymous credential<\/em>. 
It allows you to be anonymous with respect to some kind of service provider while still proving a limited claim about yourself, like \u201cI am over 18,\u201d or in the case of personhood credentials, you prove, \u201cI am a person.\u201d So that&#8217;s all one body of literature. Then there&#8217;s this huge other body of literature and set of conversations happening in policy circles right now around what to do about AI. Huge questions abounding. Shrey and I have written a prior paper called \u201cContextual Confidence and Generative AI,\u201d which we talked about on this podcast, as well, and in that paper, we offered a framework for thinking about the specific ways that generative AI, sort of, threatens the foundations of our modes of communication online. And we outlined about 16 different solutions that could help us to solve the coming problems that generative AI might bring to our online ecosystems. And what we decided to do in this paper was focus on a set of solutions that we thought are not getting enough attention in those AI and AI policy circles. And so part of what this paper is doing is bringing together these ideas from this long body of work in cryptography <em>into<\/em> those conversations.<\/p>\n\n\n\n<p><strong>TINGLE:<\/strong> I&#8217;d like to know more about your methodology, Shrey. 
How did your team go about conducting this research?<\/p>\n\n\n\n<p><strong>JAIN:<\/strong> So we had a wide range of collaborators from industry, academia, and the civil sector who work on topics of digital identity, privacy, advocacy, security, and AI policy and who came together to think about the clearest way to explain what we believe is a countermeasure that can protect against AI-powered deception. From a technological point of view, there&#8217;s already a large body of work that we can reference, but from a \u201chow this can be implemented\u201d point of view, we wanted to discuss the tradeoffs that various academics and industry leaders are thinking about. Can we communicate that very clearly? And so the methodology here was really about bringing together a wide range of collaborators to bridge these two bodies of work and communicate them clearly\u2014not just the technical solutions but also the tradeoffs.<\/p>\n\n\n\n<p><strong>TINGLE:<\/strong> So, Zo\u00eb, what are the major findings here, and how are they presented in the paper?<\/p>\n\n\n\n<p><strong>HITZIG:<\/strong> I am an economist by training. Economists love to talk about tradeoffs. You know, when you have some of <em>this<\/em>, it means you have a little bit less of <em>that<\/em>. It&#8217;s kind of like the whole business of economics. And a key finding of the paper, as I see it, is that we begin with what feels like a tradeoff, which is on the one hand, as Shrey was saying, we want to be able to be anonymous online because that has great benefits. It means we can speak truth to power. It means we can protect civil liberties and invite everyone into online spaces. You know, privacy is a core feature of the internet. And at the same time, the, kind of, other side of the tradeoff that we&#8217;re often presented is, well, if you want all that privacy and anonymity, it means that you can&#8217;t have accountability. 
There&#8217;s no way of tracking down the bad actors and making sure that they don&#8217;t do something bad again. And we&#8217;re presented with this tradeoff between anonymity on the one hand and accountability on the other hand. All that is to say, a key finding of this paper, as I see it, is that personhood credentials and more generally this class of anonymous credentials that allow you to prove different pieces of your identity online without revealing your entire identity actually allow you to evade the tradeoff and allow you to, in some sense, have your cake and eat it, too. What it allows us to do is to create <em>some<\/em> accountability, to put back some way of tracing people&#8217;s digital activities to an <em>accountable<\/em> entity. What we also present in the paper are a number of different, sort of, key challenges that will have to be taken into account in building any kind of system like this. But we present all of that, all of those challenges going forward, as potentially very worth grappling with because of the potential for this, sort of, idea to allow us to preserve the internet&#8217;s commitment to privacy, free speech, and anonymity while also creating accountability for harm.<\/p>\n\n\n\n<p><strong>TINGLE:<\/strong> So Zo\u00eb mentioned some of these tradeoffs. Let&#8217;s talk a little bit more about real-world impact, Shrey. Who benefits most from this work?<\/p>\n\n\n\n<p><strong>JAIN: <\/strong>I think there&#8217;s many different people that benefit. One is anyone who&#8217;s communicating or doing anything online in that they can have more confidence in their interactions. And it, kind of, builds back on the paper that Zo\u00eb and I wrote last year on contextual confidence and generative AI, which is that we want to have confidence in our interactions, and in order to do that, one component is being able to identify <em>who<\/em> you&#8217;re speaking with and also doing it in a privacy-preserving way. 
And I think another group that benefits is policymakers. I think today, when we think about the language and technologies that are being promoted, this complements a lot of the existing work that&#8217;s being done on provenance and watermarking. And this work can help policymakers be more effective in their mission of creating a safer online space in that it highlights a technology that is not currently discussed as much as these other solutions and complements them in order to protect online communication. <\/p>\n\n\n\n<p><strong>HITZIG:<\/strong> You know, social media is flooded with bots, and sometimes the problem with bots is that they&#8217;re posting fake content, but other times, the problem with bots is that there are just so many of them and they&#8217;re all retweeting each other and it&#8217;s very hard to tell what&#8217;s real. And so what a personhood credential can do is say, you know, maybe each person is only allowed to have five accounts on a particular social media platform.<\/p>\n\n\n\n<p><strong>TINGLE: <\/strong>So, Shrey, what&#8217;s next on your research agenda? Are there lingering questions\u2014I know there are\u2014and key challenges here, and if so, how do you hope to answer them?<\/p>\n\n\n\n<p><strong>JAIN:<\/strong> We believe we&#8217;ve aggregated a strong set of industry, academic, and, you know, civil sector collaborators, but we&#8217;re only a small subset of the people who are going to be interacting with these systems. And so the first area of next steps is to gather feedback about the solution we&#8217;ve proposed and how we can improve it: are there tradeoffs that we&#8217;re missing? Are there technical components that we didn&#8217;t think through deeply enough? And I think there are a lot of narrow open questions that come out of this. 
For instance, how do personhood credentials relate to existing identity theft or protection laws? In areas where service providers can&#8217;t require government IDs, how does that apply to personhood credentials that rely on government IDs? I think there are a lot of these open questions that we address in the paper that need more experimentation and thinking through, but also a lot of empirical work to be done. How do people react to personhood credentials, and do they actually enhance confidence in their interactions online? I think there are a lot of open questions on the actual effectiveness of these tools. And so I think there&#8217;s a large area of work to be done there, as well.<\/p>\n\n\n\n<p><strong>HITZIG: <\/strong>I&#8217;ve been thinking a lot about the early days of the internet. I wasn&#8217;t around for that, but I know that every little decision that was made in a very short period of time had incredibly lasting consequences that we&#8217;re still dealing with now. There&#8217;s an enormous path dependence in every kind of technology. And I feel that right now, we&#8217;re in that period of time, the small window where generative AI is this new thing to contend with, and it&#8217;s uprooting many of our assumptions about how our systems can work or should work. And I&#8217;m trying to think about how to set up those institutions, make these tiny decisions <em>right<\/em> so that in the future we have a digital architecture that&#8217;s really serving the goals that we want it to serve.<\/p>\n\n\n\n<p>[MUSIC]<\/p>\n\n\n\n<p><strong>TINGLE: <\/strong>Very thoughtful. With that, Shrey Jain, Zo\u00eb Hitzig, thank you so much for joining us today.<\/p>\n\n\n\n<p><strong>HITZIG:<\/strong> Thank you so much, Amber.<\/p>\n\n\n\n<p><strong>TINGLE:<\/strong> And thanks to our listeners, as well. 
If you&#8217;d like to learn more about Shrey and Zo\u00eb&#8217;s work on personhood credentials and advanced AI, you&#8217;ll find a link to this paper at aka.ms\/abstracts, or you can read it on arXiv. Thanks again for tuning in. I&#8217;m Amber Tingle, and we hope you&#8217;ll join us next time on <em>Abstracts<\/em>.<\/p>\n\n\n\n<p>[MUSIC FADES]<\/p>\n\n\t\t\t\t<\/span>\n\t\t\t<\/div>\n\t\t\t<button\n\t\t\t\tclass=\"action-trigger glyph-prepend mt-2 mb-0 show-more-show-less-toggle\"\n\t\t\t\taria-expanded=\"false\"\n\t\t\t\tdata-show-less-text=\"Show less\"\n\t\t\t\ttype=\"button\"\n\t\t\t\taria-controls=\"show-more-show-less-toggle-1\"\n\t\t\t\taria-label=\"Show more content\"\n\t\t\t\tdata-alternate-aria-label=\"Show less content\">\n\t\t\t\tShow more\t\t\t<\/button>\n\t\t<\/div>\n\t<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Advanced AI may make it easier for bad actors to deceive others online. A multidisciplinary research team is exploring one solution: a credential that allows people to show they\u2019re not bots without sharing identifying information. 
Shrey Jain and Zo\u00eb Hitzig explain.<\/p>\n","protected":false},"author":37583,"featured_media":1079022,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"https:\/\/player.blubrry.com\/id\/134878498","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":null,"msr_hide_image_in_river":0,"footnotes":""},"categories":[240054],"tags":[],"research-area":[13556,13558],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[243990],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[268128],"class_list":["post-1073550","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-msr-podcast","msr-research-area-artificial-intelligence","msr-research-area-security-privacy-cryptography","msr-locale-en_us","msr-post-option-podcast-featured","msr-podcast-series-abstracts"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"https:\/\/player.blubrry.com\/id\/134878498","podcast_episode":"","msr_research_lab":[],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[881565],"related-projects":[],"related-events":[],"related-researchers":[{"type":"user_nicename","value":"Amber Tingle","user_id":42681,"display_name":"Amber Tingle","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/ambertingle\/?lang=zh-cn\" aria-label=\"\u8bbf\u95ee\u4e2a\u4eba\u8d44\u6599\u9875\u9762\u4e86\u89e3Amber Tingle\">Amber Tingle<\/a>","is_active":false,"last_first":"Tingle, Amber","people_section":0,"alias":"ambertingle"},{"type":"guest","value":"zo-hitzig","user_id":"985560","display_name":"Zo&euml; Hitzig","author_link":"<a href=\"http:\/\/www.zoehitzig.com\/\" 
aria-label=\"\u8bbf\u95ee\u4e2a\u4eba\u8d44\u6599\u9875\u9762\u4e86\u89e3Zo&euml; Hitzig\">Zo&euml; Hitzig<\/a>","is_active":true,"last_first":"Hitzig, Zo&euml;","people_section":0,"alias":"zo-hitzig"}],"msr_type":"Post","featured_image_thumbnail":"<img width=\"960\" height=\"540\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-960x540.jpg\" class=\"img-object-cover\" alt=\"Stylized microphone and sound waves illustration.\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788-1280x720.jpg 1280w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/08\/Episode-15_Abstracts_Hero_Feature_No_Text_1400x788.jpg 1400w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/ambertingle\/\" title=\"Go to researcher profile for Amber Tingle\" aria-label=\"Go to researcher profile for Amber Tingle\" data-bi-type=\"byline author\" data-bi-cN=\"Amber Tingle\">Amber Tingle<\/a>, Shrey Jain, and <a href=\"http:\/\/www.zoehitzig.com\/\" title=\"Go to researcher profile for Zo&euml; Hitzig\" aria-label=\"Go to researcher profile for Zo&euml; Hitzig\" data-bi-type=\"byline author\" data-bi-cN=\"Zo&euml; Hitzig\">Zo&euml; Hitzig<\/a>","formattedDate":"August 15, 2024","formattedExcerpt":"Advanced AI may make it easier for bad actors to deceive others online. A multidisciplinary research team is exploring one solution: a credential that allows people to show they\u2019re not bots without sharing identifying information. 
Shrey Jain and Zo\u00eb Hitzig explain.","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/1073550","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/37583"}],"replies":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/comments?post=1073550"}],"version-history":[{"count":29,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/1073550\/revisions"}],"predecessor-version":[{"id":1079025,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/1073550\/revisions\/1079025"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/1079022"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=1073550"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/categories?post=1073550"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/tags?post=1073550"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=1073550"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=1073550"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=1073550"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft
.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=1073550"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=1073550"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=1073550"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=1073550"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=1073550"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}