{"id":875019,"date":"2022-09-08T06:33:31","date_gmt":"2022-09-08T13:33:31","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-event&#038;p=875019"},"modified":"2022-11-21T08:34:11","modified_gmt":"2022-11-21T16:34:11","slug":"metaphors-for-human-ai-interaction-workshop","status":"publish","type":"msr-event","link":"https:\/\/www.microsoft.com\/en-us\/research\/event\/metaphors-for-human-ai-interaction-workshop\/","title":{"rendered":"Metaphors for Human-AI Interaction Workshop"},"content":{"rendered":"\n\n\n\n\n<p><strong>This is an invite-only workshop. Please do not forward.<\/strong><\/p>\n\n\n\n<p>Design for human-AI interaction has drawn on various metaphors, including the <em>collaborating partner<\/em>, the <em>helpful<\/em> <em>assistant<\/em> and the <em>co-pilot<\/em>. These metaphors tend to focus on <em>explicit<\/em> interactions between humans and AI. However, interactions between humans and intelligent systems are also <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3290605.3300647\" target=\"_blank\" rel=\"noopener noreferrer\">implicit<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, making it difficult for users to build mental models of what the system is doing or how it does it. In this workshop, we will explore an extended set of metaphors, with the aim of facilitating (i) design and (ii) user understanding of how people work both <em>with<\/em> and <em>through<\/em> AI systems, as they create content and data, both intentionally and through traces of activity.&nbsp;<\/p>\n\n\n\n\n\n<p>For instance, <a href=\"https:\/\/www.microsoft.com\/en-gb\/microsoft-viva\/topics\" target=\"_blank\" rel=\"noreferrer noopener\">Viva Topics<\/a> is an intelligent system that builds an organisational knowledge base from content generated by organisation members, and then disseminates this across the organisation. 
Interactions between AI and organisation members in this case are largely implicit, and the algorithms that build the knowledge base and highlight its content to other organisation members might be understood as <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.tandfonline.com\/doi\/abs\/10.1207\/s15327051hci2004_1\" target=\"_blank\" rel=\"noopener noreferrer\"><em>mediators<\/em><span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, in that they mediate interactions between people and the knowledge base, and also between people and other people by connecting them through content recommendations. Another relevant metaphor is that of <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/aisel.aisnet.org\/jais\/vol10\/iss5\/1\/\" target=\"_blank\" rel=\"noopener noreferrer\"><em>infrastructure<\/em><span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. The pervasive and background qualities of these systems resonate with other technological infrastructures that the HCI community has considered.<\/p>\n\n\n\n<p>Despite the infrastructural quality of Viva Topics, the output of the ML that underpins it can be foregrounded and directly edited by people. For instance, human-readable schemas, produced by <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/alexandria-in-microsoft-viva-topics-from-big-data-to-big-knowledge\/\" target=\"_blank\" rel=\"noreferrer noopener\">probabilistic programming<\/a> techniques, can be <em>curated<\/em> by organisation members and are then stored as <em>stable<\/em> values in the knowledge base. These representations of knowledge fold into organisational work, by forming the basis of AI-enabled recommendations (e.g., of other organisation members who are knowledgeable about a topic, or of relevant resources). 
In contrast, ML outputs produced by neural embedding based ML models are <em>fluid<\/em>, being produced in response to user queries in the moment. Deep neural ML is often associated with partnership experiences such as <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/features\/copilot\/\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub Copilot<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>. While this interaction is, in many ways, explicit, it also has implicit qualities, in that human input informs the ML in ways that are not visible to its users.&nbsp;<\/p>\n\n\n\n<p>Thus, different ML technologies have different implications for how metaphors can support users, designers and developers in understanding and creating intelligent systems. These metaphors may speak to both implicit and explicit qualities of interactions between people and the same ML technology. <\/p>\n\n\n\n<p>In this workshop, we will explore the idea that expanding the repertoire of metaphors employed when developing ML systems and communicating their properties to users could: <\/p>\n\n\n\n<ul class=\"wp-block-list\" id=\"block-92a022a0-a025-46bb-b595-fe57efa3d063\"><li>Support the users of intelligent systems in understanding how, through their activity and interactions, they are impacting and being impacted by algorithms, thus allowing opportunities for agency and repair;&nbsp;<\/li><li>Support designers who create user experiences that incorporate human-AI interactions in (i) understanding and articulating the nature of those interactions and (ii) making systems more transparent and explainable to their users;&nbsp;<\/li><li>Support decision-makers who design and develop ML systems in understanding the implications of building software that incorporates different ML technologies, especially in terms of their potential to make ML outputs human-readable, curatable, stable, or otherwise potentially capable of 
serving as a \u2018<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.jstor.org\/stable\/285080\" target=\"_blank\" rel=\"noopener noreferrer\">boundary object<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>\u2019 between humans and AI.&nbsp;<\/li><\/ul>\n\n\n\n\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 id=\"speakers\">Speakers<\/h2>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-top is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/pure.au.dk\/portal\/en\/persons\/susanne-boedker(87d4fbb6-b38c-449e-b87d-59f693b7d6f0).html\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"150\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/susanne-bodker_circle-150x150.jpg\" alt=\"headshot of Susanne B\u00f8dker\" class=\"wp-image-875841\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/susanne-bodker_circle-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/susanne-bodker_circle-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/susanne-bodker_circle-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/susanne-bodker_circle.jpg 360w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a><\/figure>\n\n\n\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h5 id=\"susanne-bodker-opens-in-new-tab\" class=\"has-text-align-center\"><a 
href=\"https:\/\/pure.au.dk\/portal\/en\/persons\/susanne-boedker(87d4fbb6-b38c-449e-b87d-59f693b7d6f0).html\" target=\"_blank\" rel=\"noreferrer noopener\">Susanne B\u00f8dker<\/a><\/h5>\n\n\n\n<p class=\"has-text-align-center\">Professor<br><em>Aarhus University<\/em><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/www.eca.ed.ac.uk\/profile\/ewa-luger\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"150\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/ewa-luger_circle-150x150.jpg\" alt=\"headshot of Ewa Luger\" class=\"wp-image-875847\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/ewa-luger_circle-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/ewa-luger_circle-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/ewa-luger_circle-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/ewa-luger_circle.jpg 360w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a><\/figure>\n\n\n\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h5 id=\"ewa-luger-opens-in-new-tab\" class=\"has-text-align-center\"><a href=\"https:\/\/www.eca.ed.ac.uk\/profile\/ewa-luger\" target=\"_blank\" rel=\"noreferrer noopener\">Ewa Luger<\/a><\/h5>\n\n\n\n<p class=\"has-text-align-center\">Professor of Human-Data Interaction<br><em>University of Edinburgh<\/em><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/githubnext.com\/team\/acr31\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img 
loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"150\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/andrew-rice_circle-150x150.jpg\" alt=\"headshot of Andrew Rice\" class=\"wp-image-875844\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/andrew-rice_circle-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/andrew-rice_circle-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/andrew-rice_circle-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/andrew-rice_circle.jpg 360w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a><\/figure>\n\n\n\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h5 id=\"andrew-rice-opens-in-new-tab\" class=\"has-text-align-center\"><a href=\"https:\/\/githubnext.com\/team\/acr31\/\" target=\"_blank\" rel=\"noreferrer noopener\">Andrew Rice<\/a><\/h5>\n\n\n\n<p class=\"has-text-align-center\">Principal Researcher<br><em>GitHub<\/em><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/uclic.ucl.ac.uk\/people\/yvonne-rogers\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"150\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yvonne-rogers_circle-150x150.jpg\" alt=\"headshot of Yvonne Rogers\" class=\"wp-image-875856\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yvonne-rogers_circle-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yvonne-rogers_circle-300x300.jpg 300w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yvonne-rogers_circle-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yvonne-rogers_circle.jpg 360w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a><\/figure>\n\n\n\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h5 id=\"yvonne-rogers-opens-in-new-tab\" class=\"has-text-align-center\"><a href=\"https:\/\/uclic.ucl.ac.uk\/people\/yvonne-rogers\" target=\"_blank\" rel=\"noreferrer noopener\">Yvonne Rogers<\/a><\/h5>\n\n\n\n<p class=\"has-text-align-center\">Professor and Director of UCLIC<br><em>UCL<\/em><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/nuryildirim.github.io\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"150\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/nur-yildirim_circle-150x150.jpg\" alt=\"headshot of Nur Yildirim\" class=\"wp-image-875862\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/nur-yildirim_circle-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/nur-yildirim_circle-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/nur-yildirim_circle-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/nur-yildirim_circle.jpg 360w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a><\/figure>\n\n\n\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h5 id=\"nur-yildirim-opens-in-new-tab\" class=\"has-text-align-center\"><a href=\"https:\/\/nuryildirim.github.io\/\" target=\"_blank\" rel=\"noreferrer noopener\">Nur 
Yildirim<\/a><\/h5>\n\n\n\n<p class=\"has-text-align-center\">PhD student<br><em>Carnegie Mellon University<\/em><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"150\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yordanz_circle-150x150.jpg\" alt=\"headshot of Yordan Zaykov\" class=\"wp-image-875868\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yordanz_circle-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yordanz_circle-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yordanz_circle-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yordanz_circle.jpg 360w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a><\/figure>\n\n\n\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h5 id=\"yordan-zaykov\" class=\"has-text-align-center\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\" target=\"_blank\" rel=\"noreferrer noopener\">Yordan Zaykov<\/a><\/h5>\n\n\n\n<p class=\"has-text-align-center\">Principal Research Engineering Manager<br><em>Microsoft Research<\/em><\/p>\n<\/div>\n<\/div>\n\n\n\n<div style=\"height:15px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 id=\"agenda\">Agenda<\/h2>\n\n\n\n<figure class=\"wp-block-table\"><table><thead><tr><th>Time (BST)<\/th><th>Session<\/th><\/tr><\/thead><tbody><tr><td>10:00<\/td><td>Opening remarks and framing<\/td><\/tr><tr><td>10:15<\/td><td><strong>Session 1: Interacting with intelligent systems<br><\/strong>We 
will consider the scope of interactions between end-users and AI and identify metaphors that help articulate and explain these. We will explore both existing metaphors and spaces where new metaphors are needed, and consider associated values and challenges.\u00a0<br><br>The discussion will be seeded by three 10-minute lightning talks that cover different ways of thinking about human-AI interaction:\u00a0<br>\u2022 Metaphors for machine learning: partners, tools, or companions? (<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/uclic.ucl.ac.uk\/people\/yvonne-rogers\" target=\"_blank\" rel=\"noopener noreferrer\">Yvonne Rogers<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>)\u00a0| <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/vOFcFs0RuQw\" target=\"_blank\" rel=\"noopener noreferrer\">video<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><br>\u2022 Technologies as tools\/mediators (<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/pure.au.dk\/portal\/en\/persons\/susanne-boedker(87d4fbb6-b38c-449e-b87d-59f693b7d6f0).html\" target=\"_blank\" rel=\"noopener noreferrer\">Susanne B\u00f8dker<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>)\u00a0| <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/7iJKjNPvDlw\" target=\"_blank\" rel=\"noopener noreferrer\">video<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><br>\u2022 AI as an educator (<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/www.eca.ed.ac.uk\/profile\/ewa-luger\" target=\"_blank\" rel=\"noopener noreferrer\">Ewa Luger<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>)\u00a0<br><br>This will be followed by breakout sessions (~30m) in which attendees will 
discuss different metaphors in more depth (after self-selecting metaphors of interest) and consider their utility in supporting user experience and understanding of intelligent systems design, alongside the challenges they raise.<\/td><\/tr><tr><td>11:15<\/td><td><em>Short break<\/em><\/td><\/tr><tr><td>11:30<\/td><td><strong>Report back and discussion<\/strong><\/td><\/tr><tr><td>12:15<\/td><td><em>Lunch break<\/em><\/td><\/tr><tr><td>13:15<\/td><td><strong>Session 2: Interacting with ML-mined data<br><\/strong>We will consider how metaphors could play a role in supporting the designers, developers and decision-makers who create intelligent systems in understanding how the ML technologies they use have implications for how people can interact with ML outputs, due to the ways in which those outputs are generated, represented, and can be made visible to or editable by humans.\u00a0\u00a0<br><br>The discussion will be seeded by three 10-minute lightning talks that highlight the complexities of designing for AI systems, and some of the differences between different ML technologies:\u00a0<br>\u2022 Designing Human-AI Interaction (<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/nuryildirim.github.io\/\" target=\"_blank\" rel=\"noopener noreferrer\">Nur Yildirim<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>)\u00a0| <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/THQlOwyH8rU\" target=\"_blank\" rel=\"noopener noreferrer\">video<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><br>\u2022 Enterprise foundation model of knowledge (<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\" target=\"_blank\" rel=\"noreferrer noopener\">Yordan Zaykov<\/a>)\u00a0| <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/qL7bofG3GLQ\" target=\"_blank\" 
rel=\"noopener noreferrer\">video<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><br>\u2022 Explicit and Implicit User-Interaction with GitHub Copilot (<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/githubnext.com\/team\/acr31\/\" target=\"_blank\" rel=\"noopener noreferrer\">Andrew Rice<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>)\u00a0<br><br>This will be followed by breakout sessions (~30m) in which attendees will consider how designers, developers and decision-makers can be supported in understanding how the deployment of different ML technologies has different implications for supporting user understanding of what those models are doing and how people can interact with them.<\/td><\/tr><tr><td>14:20<\/td><td><em>Short break<\/em><\/td><\/tr><tr><td>14:35<\/td><td><strong>Report back and discussion<\/strong><\/td><\/tr><tr><td>15:20<\/td><td>Closing remarks<\/td><\/tr><tr><td>15:30<\/td><td>End<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 id=\"workshop-organizers\">Workshop organizers<\/h2>\n\n\n\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/sianl\/\">Si\u00e2n Lindley<\/a>, Microsoft Research Cambridge<br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\">Yordan Zaykov<\/a>, Microsoft Research Cambridge<br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/t-idal\/\">Ida Larsen-Ledet<\/a>, Microsoft Research Cambridge<br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/brburlin\/\">Britta Burlin<\/a>, Microsoft Research Cambridge<\/p>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 id=\"microsofts-event-code-of-conduct\">Microsoft\u2019s Event Code of Conduct<\/h4>\n\n\n\n<p>Microsoft\u2019s mission is to empower every person and every organization on the 
planet to achieve more. This includes events Microsoft hosts and participates in, where we seek to create a respectful, friendly, and inclusive experience for all participants. As such, we do not tolerate harassing or disrespectful behavior, messages, images, or interactions by any event participant, in any form, at any aspect of the program including business and social activities, regardless of location. <\/p>\n\n\n\n<p>We do not tolerate any behavior that is degrading to any gender, race, sexual orientation or disability, or any behavior that would violate <a href=\"https:\/\/www.microsoft.com\/en-us\/legal\/compliance\/default.aspx\" target=\"_blank\" rel=\"noreferrer noopener\">Microsoft\u2019s Anti-Harassment and Anti-Discrimination Policy, Equal Employment Opportunity Policy, or&nbsp;Standards of Business Conduct<\/a>. In short, the entire experience at the venue must meet our culture standards. We encourage everyone to assist in creating a welcoming and safe environment. Please <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/aka.ms\/reportconcern\" target=\"_blank\" rel=\"noopener noreferrer\">report<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> any concerns, harassing behavior, or suspicious or disruptive activity to venue staff, the event host or owner, or event staff. Microsoft reserves the right to refuse admittance to or remove any person from company-sponsored events at any time in its sole discretion.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--1\"><a data-bi-type=\"button\" class=\"wp-block-button__link\" href=\"https:\/\/aka.ms\/reportconcern\" target=\"_blank\" rel=\"noreferrer noopener\">Report a concern<\/a><\/div>\n<\/div>\n\n\n","protected":false},"excerpt":{"rendered":"<p>This is an invite-only workshop. Please do not forward. 
Design for human-AI interaction has drawn on various metaphors, including the collaborating partner, the helpful assistant and the co-pilot. These metaphors tend to focus on explicit interactions between humans and AI. However, interactions between humans and intelligent systems are also implicit (opens in new tab), making [&hellip;]<\/p>\n","protected":false},"featured_media":874611,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_startdate":"2022-10-26","msr_enddate":"","msr_location":"Virtual | Cambridge, UK","msr_expirationdate":"","msr_event_recording_link":"","msr_event_link":"","msr_event_link_redirect":false,"msr_event_time":"10:00\u201315:30  BST","msr_hide_region":false,"msr_private_event":true,"msr_hide_image_in_river":0,"footnotes":""},"research-area":[13556,13554],"msr-region":[],"msr-event-type":[],"msr-video-type":[],"msr-locale":[268875],"msr-program-audience":[],"msr-post-option":[],"msr-impact-theme":[],"class_list":["post-875019","msr-event","type-msr-event","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_about":"<!-- wp:msr\/event-details {\"title\":\"Metaphors for Human-AI Interaction Workshop\",\"image\":{\"id\":874611,\"url\":\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B.jpg\",\"alt\":\"Abstract image with blue, purple, and orange tiles moving upward\"}} \/-->\n\n<!-- wp:msr\/content-tabs -->\n<!-- wp:msr\/content-tab -->\n<!-- wp:paragraph -->\n<p><strong>This is an invite-only workshop. 
Please do not forward.<\/strong><\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>Design for human-AI interaction has drawn on various metaphors, including the <em>collaborating partner<\/em>, the <em>helpful<\/em> <em>assistant<\/em> and the <em>co-pilot<\/em>. These metaphors tend to focus on <em>explicit<\/em> interactions between humans and AI. However, interactions between humans and intelligent systems are also <a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3290605.3300647\" target=\"_blank\" rel=\"noreferrer noopener\">implicit<\/a>, making it difficult for users to build mental models of what the system is doing or how it does it. In this workshop, we will explore an extended set of metaphors, with the aim of facilitating (i) design and (ii) user understanding of how people work both <em>with<\/em> and <em>through<\/em> AI systems, as they create content and data, both intentionally and through traces of activity.&nbsp;<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:moray\/accordion -->\n<!-- wp:moray\/accordion-item {\"title\":\"Read more\"} -->\n<!-- wp:paragraph {\"placeholder\":\"Write content\u2026\"} -->\n<p>For instance, <a href=\"https:\/\/www.microsoft.com\/en-gb\/microsoft-viva\/topics\" target=\"_blank\" rel=\"noreferrer noopener\">Viva Topics<\/a> is an intelligent system that builds an organisational knowledge base from content generated by organisation members, and then disseminates this across the organisation. Interactions between AI and organisation members in this case are largely implicit, and the algorithms that build the knowledge base and highlight its content to other organisation members might be understood as <a href=\"https:\/\/www.tandfonline.com\/doi\/abs\/10.1207\/s15327051hci2004_1\" target=\"_blank\" rel=\"noreferrer noopener\"><em>mediators<\/em><\/a>, in that they mediate interactions between people and the knowledge base, and also between people and other people by connecting them through content recommendations. 
Another relevant metaphor is that of <a href=\"https:\/\/aisel.aisnet.org\/jais\/vol10\/iss5\/1\/\" target=\"_blank\" rel=\"noreferrer noopener\"><em>infrastructure<\/em><\/a>. The pervasive and background qualities of these systems resonate with other technological infrastructures that the HCI community has considered.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph {\"placeholder\":\"Write content\u2026\"} -->\n<p>Despite the infrastructural quality of Viva Topics, the output of the ML that underpins it can be foregrounded and directly edited by people. For instance, human-readable schemas, produced by <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/blog\/alexandria-in-microsoft-viva-topics-from-big-data-to-big-knowledge\/\" target=\"_blank\" rel=\"noreferrer noopener\">probabilistic programming<\/a> techniques, can be <em>curated<\/em> by organisation members and are then stored as <em>stable<\/em> values in the knowledge base. These representations of knowledge fold into organisational work, by forming the basis of AI-enabled recommendations (e.g., of other organisation members who are knowledgeable about a topic, or of relevant resources). In contrast, ML outputs produced by neural embedding based ML models are <em>fluid<\/em>, being produced in response to user queries in the moment. Deep neural ML is often associated with partnership experiences such as <a href=\"https:\/\/github.com\/features\/copilot\/\" target=\"_blank\" rel=\"noreferrer noopener\">GitHub Copilot<\/a>. While this interaction is, in many ways, explicit, it also has implicit qualities, in that human input informs the ML in ways that are not visible to its users.&nbsp;<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>Thus, different ML technologies have different implications for how metaphors can support users, designers and developers in understanding and creating intelligent systems. 
These metaphors may speak to both implicit and explicit qualities of interactions between people and the same ML technology. <\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>In this workshop, we will explore the idea that expanding the repertoire of metaphors employed when developing ML systems and communicating their properties to users could: <\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:list -->\n<ul id=\"block-92a022a0-a025-46bb-b595-fe57efa3d063\"><li>Support the users of intelligent systems in understanding how, through their activity and interactions, they are impacting and being impacted by algorithms, thus allowing opportunities for agency and repair;&nbsp;<\/li><li>Support designers who create user experiences that incorporate human-AI interactions in (i) understanding and articulating the nature of those interactions and (ii) making systems more transparent and explainable to their users;&nbsp;<\/li><li>Support decision-makers who design and develop ML systems in understanding the implications of building software that incorporates different ML technologies, especially in terms of their potential to make ML outputs human-readable, curatable, stable, or otherwise potentially capable of serving as a \u2018<a href=\"https:\/\/www.jstor.org\/stable\/285080\" target=\"_blank\" rel=\"noreferrer noopener\">boundary object<\/a>\u2019 between humans and AI.&nbsp;<\/li><\/ul>\n<!-- \/wp:list -->\n<!-- \/wp:moray\/accordion-item -->\n<!-- \/wp:moray\/accordion -->\n\n<!-- wp:spacer {\"height\":\"40px\"} -->\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading -->\n<h2>Speakers<\/h2>\n<!-- \/wp:heading -->\n\n<!-- wp:columns {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-columns are-vertically-aligned-top\"><!-- wp:column {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-column is-vertically-aligned-top\"><!-- wp:image 
{\"id\":875841,\"sizeSlug\":\"thumbnail\",\"linkDestination\":\"custom\"} -->\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/pure.au.dk\/portal\/en\/persons\/susanne-boedker(87d4fbb6-b38c-449e-b87d-59f693b7d6f0).html\" target=\"_blank\" rel=\"noreferrer noopener\"><img src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/susanne-bodker_circle-150x150.jpg\" alt=\"headshot of Susanne B\u00f8dker\" class=\"wp-image-875841\"\/><\/a><\/figure>\n<!-- \/wp:image -->\n\n<!-- wp:spacer {\"height\":\"7px\"} -->\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"textAlign\":\"center\",\"level\":5} -->\n<h5 class=\"has-text-align-center\"><a href=\"https:\/\/pure.au.dk\/portal\/en\/persons\/susanne-boedker(87d4fbb6-b38c-449e-b87d-59f693b7d6f0).html\" target=\"_blank\" rel=\"noreferrer noopener\">Susanne B\u00f8dker<\/a><\/h5>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph {\"align\":\"center\"} -->\n<p class=\"has-text-align-center\">Professor<br><em>Aarhus University<\/em><\/p>\n<!-- \/wp:paragraph --><\/div>\n<!-- \/wp:column -->\n\n<!-- wp:column {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-column is-vertically-aligned-top\"><!-- wp:image {\"id\":875847,\"sizeSlug\":\"thumbnail\",\"linkDestination\":\"custom\"} -->\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/www.eca.ed.ac.uk\/profile\/ewa-luger\" target=\"_blank\" rel=\"noreferrer noopener\"><img src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/ewa-luger_circle-150x150.jpg\" alt=\"headshot of Ewa Luger\" class=\"wp-image-875847\"\/><\/a><\/figure>\n<!-- \/wp:image -->\n\n<!-- wp:spacer {\"height\":\"7px\"} -->\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"textAlign\":\"center\",\"level\":5} -->\n<h5 class=\"has-text-align-center\"><a 
href=\"https:\/\/www.eca.ed.ac.uk\/profile\/ewa-luger\" target=\"_blank\" rel=\"noreferrer noopener\">Ewa Luger<\/a><\/h5>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph {\"align\":\"center\"} -->\n<p class=\"has-text-align-center\">Professor of Human-Data Interaction<br><em>University of Edinburgh<\/em><\/p>\n<!-- \/wp:paragraph --><\/div>\n<!-- \/wp:column -->\n\n<!-- wp:column {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-column is-vertically-aligned-top\"><!-- wp:image {\"id\":875844,\"sizeSlug\":\"thumbnail\",\"linkDestination\":\"custom\"} -->\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/githubnext.com\/team\/acr31\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/andrew-rice_circle-150x150.jpg\" alt=\"headshot of Andrew Rice\" class=\"wp-image-875844\"\/><\/a><\/figure>\n<!-- \/wp:image -->\n\n<!-- wp:spacer {\"height\":\"7px\"} -->\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"textAlign\":\"center\",\"level\":5} -->\n<h5 class=\"has-text-align-center\"><a href=\"https:\/\/githubnext.com\/team\/acr31\/\" target=\"_blank\" rel=\"noreferrer noopener\">Andrew Rice<\/a><\/h5>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph {\"align\":\"center\"} -->\n<p class=\"has-text-align-center\">Principal Researcher<br><em>GitHub<\/em><\/p>\n<!-- \/wp:paragraph --><\/div>\n<!-- \/wp:column -->\n\n<!-- wp:column {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-column is-vertically-aligned-top\"><!-- wp:image {\"id\":875856,\"sizeSlug\":\"thumbnail\",\"linkDestination\":\"custom\"} -->\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/uclic.ucl.ac.uk\/people\/yvonne-rogers\" target=\"_blank\" rel=\"noreferrer noopener\"><img src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yvonne-rogers_circle-150x150.jpg\" 
alt=\"headshot of Yvonne Rogers\" class=\"wp-image-875856\"\/><\/a><\/figure>\n<!-- \/wp:image -->\n\n<!-- wp:spacer {\"height\":\"7px\"} -->\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"textAlign\":\"center\",\"level\":5} -->\n<h5 class=\"has-text-align-center\"><a href=\"https:\/\/uclic.ucl.ac.uk\/people\/yvonne-rogers\" target=\"_blank\" rel=\"noreferrer noopener\">Yvonne Rogers<\/a><\/h5>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph {\"align\":\"center\"} -->\n<p class=\"has-text-align-center\">Professor and Director of UCLIC<br><em>UCL<\/em><\/p>\n<!-- \/wp:paragraph --><\/div>\n<!-- \/wp:column -->\n\n<!-- wp:column {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-column is-vertically-aligned-top\"><!-- wp:image {\"id\":875862,\"sizeSlug\":\"thumbnail\",\"linkDestination\":\"custom\"} -->\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/nuryildirim.github.io\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/nur-yildirim_circle-150x150.jpg\" alt=\"headshot of Nur Yildirim\" class=\"wp-image-875862\"\/><\/a><\/figure>\n<!-- \/wp:image -->\n\n<!-- wp:spacer {\"height\":\"7px\"} -->\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"textAlign\":\"center\",\"level\":5} -->\n<h5 class=\"has-text-align-center\"><a href=\"https:\/\/nuryildirim.github.io\/\" target=\"_blank\" rel=\"noreferrer noopener\">Nur Yildirim<\/a><\/h5>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph {\"align\":\"center\"} -->\n<p class=\"has-text-align-center\">PhD student<br><em>Carnegie Mellon University<\/em><\/p>\n<!-- \/wp:paragraph --><\/div>\n<!-- \/wp:column -->\n\n<!-- wp:column {\"verticalAlignment\":\"top\"} -->\n<div class=\"wp-block-column is-vertically-aligned-top\"><!-- wp:image 
{\"id\":875868,\"sizeSlug\":\"thumbnail\",\"linkDestination\":\"custom\"} -->\n<figure class=\"wp-block-image size-thumbnail\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/yordanz_circle-150x150.jpg\" alt=\"headshot of Yordan Zaykov\" class=\"wp-image-875868\"\/><\/a><\/figure>\n<!-- \/wp:image -->\n\n<!-- wp:spacer {\"height\":\"7px\"} -->\n<div style=\"height:7px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"textAlign\":\"center\",\"level\":5} -->\n<h5 class=\"has-text-align-center\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\" target=\"_blank\" rel=\"noreferrer noopener\">Yordan Zaykov<\/a><\/h5>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph {\"align\":\"center\"} -->\n<p class=\"has-text-align-center\">Principal Research Engineering Manager<br><em>Microsoft Research<\/em><\/p>\n<!-- \/wp:paragraph --><\/div>\n<!-- \/wp:column --><\/div>\n<!-- \/wp:columns -->\n\n<!-- wp:spacer {\"height\":\"15px\"} -->\n<div style=\"height:15px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading -->\n<h2>Agenda<\/h2>\n<!-- \/wp:heading -->\n\n<!-- wp:table -->\n<figure class=\"wp-block-table\"><table><thead><tr><th>Time (BST)<\/th><th>Session<\/th><\/tr><\/thead><tbody><tr><td>10:00<\/td><td>Opening remarks and framing<\/td><\/tr><tr><td>10:15<\/td><td><strong>Session 1: Interacting with intelligent systems<br><\/strong>We will consider the scope of interactions between end-users and AI and identify metaphors that help articulate and explain these. 
We will explore both existing metaphors and spaces where new metaphors are needed, and consider associated values and challenges.\u00a0<br><br>The discussion will be seeded by three 10-minute lightning talks that cover different ways of thinking about human-AI interaction:\u00a0<br>\u2022 Metaphors for machine learning: partners, tools, or companions? (<a href=\"https:\/\/uclic.ucl.ac.uk\/people\/yvonne-rogers\" target=\"_blank\" rel=\"noreferrer noopener\">Yvonne Rogers<\/a>)\u00a0| <a href=\"https:\/\/youtu.be\/vOFcFs0RuQw\" target=\"_blank\" rel=\"noreferrer noopener\">video<\/a><br>\u2022 Technologies as tools\/mediators (<a href=\"https:\/\/pure.au.dk\/portal\/en\/persons\/susanne-boedker(87d4fbb6-b38c-449e-b87d-59f693b7d6f0).html\" target=\"_blank\" rel=\"noreferrer noopener\">Susanne B\u00f8dker<\/a>)\u00a0| <a href=\"https:\/\/youtu.be\/7iJKjNPvDlw\" target=\"_blank\" rel=\"noreferrer noopener\">video<\/a><br>\u2022 AI as an educator (<a href=\"https:\/\/www.eca.ed.ac.uk\/profile\/ewa-luger\" target=\"_blank\" rel=\"noreferrer noopener\">Ewa Luger<\/a>)\u00a0<br><br>This will be followed by breakout sessions (~30m) in which attendees will discuss different metaphors in more depth (after self-selecting metaphors of interest) and consider their utility in supporting user experience and understanding of intelligent systems design, alongside the challenges they raise.<\/td><\/tr><tr><td>11:15<\/td><td><em>Short break<\/em><\/td><\/tr><tr><td>11:30<\/td><td><strong>Report back and discussion<\/strong><\/td><\/tr><tr><td>12:15<\/td><td><em>Lunch break<\/em><\/td><\/tr><tr><td>13:15<\/td><td><strong>Session 2: Interacting with ML-mined data<br><\/strong>We will consider how metaphors could play a role in supporting the designers, developers and decision-makers that create intelligent systems in understanding how the ML technologies they use have implications for how people can interact with ML outputs, due to the ways in which those outputs are generated, represented, and 
can be made visible to or editable by humans.\u00a0\u00a0<br><br>The discussion will be seeded by three 10-minute lightning talks that highlight the complexities of designing for AI systems, and some of the differences between different ML technologies:\u00a0<br>\u2022 Designing Human-AI Interaction (<a href=\"https:\/\/nuryildirim.github.io\/\" target=\"_blank\" rel=\"noreferrer noopener\">Nur Yildirim<\/a>)\u00a0| <a href=\"https:\/\/youtu.be\/THQlOwyH8rU\" target=\"_blank\" rel=\"noreferrer noopener\">video<\/a><br>\u2022 Enterprise foundation model of knowledge (<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\" target=\"_blank\" rel=\"noreferrer noopener\">Yordan Zaykov<\/a>)\u00a0| <a href=\"https:\/\/youtu.be\/qL7bofG3GLQ\" target=\"_blank\" rel=\"noreferrer noopener\">video<\/a><br>\u2022 Explicit and Implicit User-Interaction with GitHub Copilot (<a href=\"https:\/\/githubnext.com\/team\/acr31\/\" target=\"_blank\" rel=\"noreferrer noopener\">Andrew Rice<\/a>)\u00a0<br><br>This will be followed by breakout sessions (~30m) in which attendees will consider how designers, developers and decision-makers can be supported in understanding how the deployment of different ML technologies has different implications for user understanding of what those models are doing and how people can interact with them.<\/td><\/tr><tr><td>14:20<\/td><td><em>Short break<\/em><\/td><\/tr><tr><td>14:35<\/td><td><strong>Report back and discussion<\/strong><\/td><\/tr><tr><td>15:20<\/td><td>Closing remarks<\/td><\/tr><tr><td>15:30<\/td><td>End<\/td><\/tr><\/tbody><\/table><\/figure>\n<!-- \/wp:table -->\n\n<!-- wp:spacer {\"height\":\"40px\"} -->\n<div style=\"height:40px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading -->\n<h2>Workshop organizers<\/h2>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/sianl\/\">Si\u00e2n 
Lindley<\/a>, Microsoft Research Cambridge<br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/yordanz\/\">Yordan Zaykov<\/a>, Microsoft Research Cambridge<br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/t-idal\/\">Ida Larsen-Ledet<\/a>, Microsoft Research Cambridge<br><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/brburlin\/\">Britta Burlin<\/a>, Microsoft Research Cambridge<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:spacer {\"height\":\"30px\"} -->\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n<!-- \/wp:spacer -->\n\n<!-- wp:heading {\"level\":4} -->\n<h4>Microsoft\u2019s Event Code of Conduct<\/h4>\n<!-- \/wp:heading -->\n\n<!-- wp:paragraph -->\n<p>Microsoft\u2019s mission is to empower every person and every organization on the planet to achieve more. This includes events Microsoft hosts and participates in, where we seek to create a respectful, friendly, and inclusive experience for all participants. As such, we do not tolerate harassing or disrespectful behavior, messages, images, or interactions by any event participant, in any form, at any aspect of the program including business and social activities, regardless of location. <\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>We do not tolerate any behavior that is degrading to any gender, race, sexual orientation or disability, or any behavior that would violate <a href=\"https:\/\/www.microsoft.com\/en-us\/legal\/compliance\/default.aspx\" target=\"_blank\" rel=\"noreferrer noopener\">Microsoft\u2019s Anti-Harassment and Anti-Discrimination Policy, Equal Employment Opportunity Policy, or&nbsp;Standards of Business Conduct<\/a>. In short, the entire experience at the venue must meet our culture standards. We encourage everyone to assist in creating a welcoming and safe environment. 
Please <a href=\"https:\/\/aka.ms\/reportconcern\" target=\"_blank\" rel=\"noreferrer noopener\">report<\/a> any concerns, harassing behavior, or suspicious or disruptive activity to venue staff, the event host or owner, or event staff. Microsoft reserves the right to refuse admittance to or remove any person from company-sponsored events at any time in its sole discretion.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:buttons -->\n<div class=\"wp-block-buttons\"><!-- wp:button {\"className\":\"is-style-outline\"} -->\n<div class=\"wp-block-button is-style-outline\"><a class=\"wp-block-button__link\" href=\"https:\/\/aka.ms\/reportconcern\" target=\"_blank\" rel=\"noreferrer noopener\">Report a concern<\/a><\/div>\n<!-- \/wp:button --><\/div>\n<!-- \/wp:buttons -->\n<!-- \/wp:msr\/content-tab -->\n<!-- \/wp:msr\/content-tabs -->","tab-content":[],"msr_startdate":"2022-10-26","msr_enddate":"","msr_event_time":"10:00\u201315:30 BST","msr_location":"Virtual | Cambridge, UK","msr_event_link":"","msr_event_recording_link":"","msr_startdate_formatted":"October 26, 2022","msr_register_text":"Watch now","msr_cta_link":"","msr_cta_text":"","msr_cta_bi_name":"","featured_image_thumbnail":"<img width=\"960\" height=\"540\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-960x540.jpg\" class=\"img-object-cover\" alt=\"Abstract image with blue, purple, and orange tiles moving upward\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-768x432.jpg 768w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-343x193.jpg 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/09\/WebsiteHero_1400x788_B.jpg 1400w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","event_excerpt":"This is an invite-only workshop. Please do not forward. Design for human-AI interaction has drawn on various metaphors, including the collaborating partner, the helpful assistant and the co-pilot. These metaphors tend to focus on explicit interactions between humans and AI. However, interactions between humans and intelligent systems are also implicit (opens in new tab), making it difficult for users to build mental models of what the system is doing or how it does it. 
In&hellip;","msr_research_lab":[],"related-researchers":[],"msr_impact_theme":[],"related-academic-programs":[],"related-groups":[],"related-projects":[],"related-opportunities":[],"related-publications":[],"related-videos":[],"related-posts":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/875019","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-event"}],"version-history":[{"count":28,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/875019\/revisions"}],"predecessor-version":[{"id":899793,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/875019\/revisions\/899793"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/874611"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=875019"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=875019"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=875019"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=875019"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=875019"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=875019"},{"taxonomy":"msr-program-audience","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-program-audience?post=875019"},
{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=875019"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=875019"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}