{"id":1059168,"date":"2024-07-31T09:00:00","date_gmt":"2024-07-31T16:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=1059168"},"modified":"2024-07-31T09:25:04","modified_gmt":"2024-07-31T16:25:04","slug":"research-focus-week-of-july-29-2024","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/research-focus-week-of-july-29-2024\/","title":{"rendered":"Research Focus: Week of July 29, 2024"},"content":{"rendered":"\n<figure class=\"wp-block-pullquote\"><blockquote><p>Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code\/datasets, new hires and other milestones from across the research community at Microsoft.<\/p><\/blockquote><\/figure>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1.jpg\" alt=\"Research Focus: July 22, 2024\" class=\"wp-image-1059216\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-240x135.jpg 240w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n\n<h2 class=\"wp-block-heading h6 has-blue-color has-text-color has-link-color wp-elements-4a169d5f341c8d758a738249d4401b93\" id=\"new-research\">NEW RESEARCH<\/h2>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"scalable-differentiable-causal-discovery-in-the-presence-of-latent-confounders-with-skeleton-posterior\">Scalable Differentiable Causal Discovery in the Presence of Latent Confounders with Skeleton Posterior<\/h2>\n\n\n\n<p>Differentiable causal discovery has made significant advances in learning directed acyclic graphs. However, its application to real-world datasets remains restricted due to the ubiquity of latent confounders and the requirement to learn maximal ancestral graphs (MAGs). Previous differentiable MAG learning algorithms have been limited to small datasets and failed to scale to larger ones (e.g., with more than 50 variables).<\/p>\n\n\n\n<p>In a recent paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/scalable-differentiable-causal-discovery-in-the-presence-of-latent-confounders-with-skeleton-posterior\/\">Scalable Differentiable Causal Discovery in the Presence of Latent Confounders with Skeleton Posterior<\/a>, researchers from Microsoft and external colleagues explore the potential of the causal skeleton, the undirected version of the causal graph, to improve accuracy and reduce the search space of the optimization procedure, thereby enhancing the performance of differentiable causal discovery. 
They propose SPOT (Skeleton Posterior-guided OpTimization), a two-phase framework that harnesses the skeleton posterior for differentiable causal discovery in the presence of latent confounders.<\/p>\n\n\n\n<p>Extensive experiments on various datasets show that SPOT substantially outperforms state-of-the-art methods for MAG learning. SPOT also estimates the skeleton posterior more accurately than non-parametric bootstrap-based methods and more recent variational inference-based methods. The adoption of the skeleton posterior shows strong promise across a variety of causal discovery tasks.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--1\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/scalable-differentiable-causal-discovery-in-the-presence-of-latent-confounders-with-skeleton-posterior\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\"\/>\n\n\n\n<h2 class=\"wp-block-heading h6 has-blue-color has-text-color has-link-color wp-elements-544c286e42c2c9af826a18d5a0232c19\" id=\"new-research-1\">NEW RESEARCH<\/h2>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"evaluating-the-feasibility-of-visual-imagery-for-an-eeg-based-brain-computer-interface\">Evaluating the Feasibility of Visual Imagery for an EEG-Based Brain\u2013Computer Interface<\/h2>\n\n\n\n<p>Brain signals recorded via non-invasive electroencephalography (EEG) could help patients with severe neuromuscular disorders communicate with and control the world around them. 
Brain-computer interface (BCI) technology could use visual imagery, or the mental simulation of visual information from memory, as an effective control paradigm, directly conveying the user\u2019s intention.<\/p>\n\n\n\n<p>Initial investigations have been unable to fully evaluate the capabilities of true spontaneous visual mental imagery. One major limitation is that the target image is typically displayed immediately preceding the imagery period. This paradigm does not capture spontaneous mental imagery, as would be necessary in an actual BCI application, but something more akin to short-term retention in visual working memory.<\/p>\n\n\n\n<p>In a recent paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/evaluating-the-feasibility-of-visual-imagery-for-an-eeg-based-brain-computer-interface\/\">Evaluating the Feasibility of Visual Imagery for an EEG-Based Brain\u2013Computer Interface<\/a>, researchers from Microsoft and external colleagues show that short-term visual imagery following the presentation of a specific target image provides a stronger, more easily classifiable neural signature in EEG than spontaneous visual imagery from long-term memory following an auditory cue for the image. 
This research, published in <em>IEEE Transactions on Neural Systems and Rehabilitation Engineering<\/em>, provides the first direct comparison of short-term and long-term visual imagery tasks and offers greater insight into the feasibility of using visual imagery as a BCI control strategy.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--2\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/evaluating-the-feasibility-of-visual-imagery-for-an-eeg-based-brain-computer-interface\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\"\/>\n\n\n\n<h2 class=\"wp-block-heading h6 has-blue-color has-text-color has-link-color wp-elements-3545725d5534823a7062db4f0bc7ea2e\" id=\"new-research-2\">NEW RESEARCH<\/h2>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"evolving-roles-and-workflows-of-creative-practitioners-in-the-age-of-generative-ai\">Evolving Roles and Workflows of Creative Practitioners in the Age of Generative AI<\/h2>\n\n\n\n<p>Many creative practitioners \u2013 designers, software developers, and architects, for example \u2013 are using generative AI models to produce text, images, and other assets. While human-computer interaction (HCI) research explores specific generative AI models and creativity support tools, little is known about practitioners\u2019 evolving roles and workflows with models across a project\u2019s stages. 
This knowledge could help guide the development of the next generation of creativity support tools.<\/p>\n\n\n\n<p>In a recent paper: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/evolving-roles-and-workflows-of-creative-practitioners-in-the-age-of-generative-ai\/\">Evolving Roles and Workflows of Creative Practitioners in the Age of Generative AI<\/a>, researchers from Microsoft and the University of California, San Diego contribute to this knowledge by employing a triangulated method to capture information from interviews, videos, and survey responses of creative practitioners reflecting on projects they completed with generative AI. Their observations help uncover a set of factors that capture practitioners\u2019 perceived roles, challenges, benefits, and interaction patterns when creating with generative AI. From these factors, the researchers offer insights and propose design opportunities and priorities that encourage the wider community of creativity support tool and generative AI stakeholders, such as system creators, researchers, and educators, to reflect on how to develop systems that meet the needs of creatives in human-centered ways.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--3\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/evolving-roles-and-workflows-of-creative-practitioners-in-the-age-of-generative-ai\/\">Read the paper<\/a><\/div>\n<\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\"\/>\n\n\n\n<h2 class=\"wp-block-heading h6 has-blue-color has-text-color has-link-color wp-elements-cf8617e912b5a80376fc6cf7f22fabc5\" id=\"new-research-3\">NEW 
RESEARCH<\/h2>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"it-s-like-a-rubber-duck-that-talks-back-understanding-generative-ai-assisted-data-analysis-workflows-through-a-participatory-prompting-study\">&#8220;It&#8217;s like a rubber duck that talks back&#8221;: Understanding Generative AI-Assisted Data Analysis Workflows through a Participatory Prompting Study<\/h2>\n\n\n\n<p>End-user tools based on generative AI can help people complete many tasks. One such task is data analysis, which is notoriously challenging for non-experts but also holds much potential for AI. To understand how data analysis workflows can be assisted or impaired by generative AI, researchers from Microsoft conducted a study using Bing Chat via participatory prompting, a newer methodology in which users and researchers reflect together on tasks through co-engagement with generative AI. The recent paper, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/its-like-a-rubber-duck-that-talks-back-understanding-generative-ai-assisted-data-analysis-workflows-through-a-participatory-prompting-study\/\">&#8220;It&#8217;s like a rubber duck that talks back&#8221;: Understanding Generative AI-Assisted Data Analysis Workflows through a Participatory Prompting Study<\/a>, demonstrates the value of the participatory prompting method. The researchers found that generative AI benefits the information foraging and sensemaking loops of data analysis in specific ways, but also introduces its own barriers and challenges, arising from the difficulties of query formulation, specifying context, and verifying results. 
Based on these findings, the paper presents several implications for future AI research and the design of new generative AI interactions.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-outline is-style-outline--4\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/its-like-a-rubber-duck-that-talks-back-understanding-generative-ai-assisted-data-analysis-workflows-through-a-participatory-prompting-study\/\">Read the paper<\/a><\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>In this issue: Skeleton Posterior-guided OpTimization (SPOT) exhibits potential in various causal discovery tasks; Using visual imagery for an EEG-based brain\u2013computer interface; Developing human-centered AI systems to assist creative 
professionals.<\/p>\n","protected":false},"author":37583,"featured_media":1059216,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":null,"msr_hide_image_in_river":0,"footnotes":""},"categories":[1],"tags":[],"research-area":[13556,13554,13559],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[243984],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-1059168","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research-blog","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-research-area-social-sciences","msr-locale-en_us","msr-post-option-blog-homepage-featured"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[199561,199565],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[144923,550641,578422,714577],"related-projects":[511097,483294,661380],"related-events":[],"related-researchers":[{"type":"user_nicename","value":"Justin Ding","user_id":32435,"display_name":"Justin Ding","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/juding\/\" aria-label=\"Visit the profile page for Justin Ding\">Justin Ding<\/a>","is_active":false,"last_first":"Ding, Justin","people_section":0,"alias":"juding"},{"type":"user_nicename","value":"Shi Han","user_id":33618,"display_name":"Shi Han","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/shihan\/\" aria-label=\"Visit the profile page for Shi Han\">Shi Han<\/a>","is_active":false,"last_first":"Han, 
Shi","people_section":0,"alias":"shihan"},{"type":"user_nicename","value":"Dongmei Zhang","user_id":31665,"display_name":"Dongmei Zhang","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/dongmeiz\/\" aria-label=\"Visit the profile page for Dongmei Zhang\">Dongmei Zhang<\/a>","is_active":false,"last_first":"Zhang, Dongmei","people_section":0,"alias":"dongmeiz"},{"type":"user_nicename","value":"Ivan Tashev","user_id":32127,"display_name":"Ivan Tashev","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/ivantash\/\" aria-label=\"Visit the profile page for Ivan Tashev\">Ivan Tashev<\/a>","is_active":false,"last_first":"Tashev, Ivan","people_section":0,"alias":"ivantash"},{"type":"guest","value":"srishti-palani","user_id":"608673","display_name":"Srishti Palani","author_link":"<a href=\"http:\/\/srishtipalani.github.io\/\" aria-label=\"Visit the profile page for Srishti Palani\">Srishti Palani<\/a>","is_active":true,"last_first":"Palani, Srishti","people_section":0,"alias":"srishti-palani"},{"type":"user_nicename","value":"Advait Sarkar","user_id":37146,"display_name":"Advait Sarkar","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/advait\/\" aria-label=\"Visit the profile page for Advait Sarkar\">Advait Sarkar<\/a>","is_active":false,"last_first":"Sarkar, Advait","people_section":0,"alias":"advait"},{"type":"user_nicename","value":"Sean Rintel","user_id":33579,"display_name":"Sean Rintel","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/serintel\/\" aria-label=\"Visit the profile page for Sean Rintel\">Sean Rintel<\/a>","is_active":false,"last_first":"Rintel, Sean","people_section":0,"alias":"serintel"},{"type":"user_nicename","value":"Lev Tankelevitch","user_id":43209,"display_name":"Lev Tankelevitch","author_link":"<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/levt\/\" aria-label=\"Visit the profile page for Lev Tankelevitch\">Lev 
Tankelevitch<\/a>","is_active":false,"last_first":"Tankelevitch, Lev","people_section":0,"alias":"levt"}],"msr_type":"Post","featured_image_thumbnail":"<img width=\"960\" height=\"540\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-960x540.jpg\" class=\"img-object-cover\" alt=\"Research Focus: July 22, 2024\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/07\/RF46-BlogHeroFeature-1400x788-1.jpg 1400w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"","formattedDate":"July 31, 2024","formattedExcerpt":"In this issue: Skeleton Posterior-guided OpTimization (SPOT) exhibits potential in various causal discovery tasks; Using visual imagery for an EEG-based brain\u2013computer interface; 
Developing human-centered AI systems to assist creative professionals.","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/1059168","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/37583"}],"replies":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/comments?post=1059168"}],"version-history":[{"count":12,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/1059168\/revisions"}],"predecessor-version":[{"id":1059930,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/1059168\/revisions\/1059930"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/1059216"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=1059168"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/categories?post=1059168"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/tags?post=1059168"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=1059168"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=1059168"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=1059168"},{"taxonomy":"msr-locale","embeddable":true,
"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=1059168"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=1059168"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=1059168"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=1059168"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=1059168"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}