{"id":741481,"date":"2021-04-20T21:12:05","date_gmt":"2021-04-21T04:12:05","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-group&#038;p=741481"},"modified":"2023-08-10T10:21:30","modified_gmt":"2023-08-10T17:21:30","slug":"knowledge-and-language","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/group\/knowledge-and-language\/","title":{"rendered":"Knowledge and Language Team"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background-gable-green card-background--full-bleed\">\n\t\t\t\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div class=\"card-foreground d-flex mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div class=\"container d-flex mt-md-n5 my-lg-5 align-self-center\">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div class=\"card-body \">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/lab\/microsoft-research-redmond\/\" class=\"icon-link icon-link--reverse mb-2\" data-bi-cN=\"Return to Microsoft Research Lab - Redmond\">\n\t\t\t\t\t\t\t\t\t<span class=\"c-glyph glyph-chevron-left\" aria-hidden=\"true\"><\/span>\n\t\t\t\t\t\t\t\t\tReturn to Microsoft Research Lab &#8211; Redmond\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 class=\"wp-block-heading h2\" id=\"knowledge-and-language-team\">Knowledge and Language Team<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<p>The Knowledge and Language Team is part of the <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/group\/cognitive-services-research\/\">Azure Cognitive Services Research (CSR) group<\/a>, focusing on cutting edge research and the development of 
the next generation framework for knowledge and natural language processing.<\/p>\n\n\n\n<p>We are working on: 1) Knowledge-Enhanced Language Models, 2) Summarization, 3) Few-shot and Prompt Learning, and 4) Multimodal Learning. We develop state-of-the-art deep learning technologies for both research and business applications.<\/p>\n\n\n\n<p>Our work has resulted in multiple publications at top NLP conferences, human parity on the HellaSwag, CommonsenseQA, and CoQA benchmarks, and first place on the CommonGen, FEVER, ARC, and SQuAD v1.0 leaderboards.<\/p>\n\n\n\n<p>Our recent work covers:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Prompt optimization in a gradient descent style [&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2305.03495.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Paper<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;]<\/li>\n\n\n\n<li>Leveraging GPT-4 to build a SoTA NLG evaluator [&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2303.16634.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Paper<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> ]<\/li>\n\n\n\n<li>The &#8220;Impossible Triangle&#8221; of pre-trained language models [&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2204.06130.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Paper<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> ]<\/li>\n\n\n\n<li>The integrative multimodal learning framework i-Code [&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2205.01818.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">i-Code v1<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;|&nbsp;<a class=\"msr-external-link glyph-append 
glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2305.12311.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">i-Code v2<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;|&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2305.11846.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">i-Code v3<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;|&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/arxiv.org\/pdf\/2212.02623.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">i-Code Doc<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;]<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"updates\">Updates<\/h2>\n\n\n\n<p>May 5, 2023: Chenguang Zhu and Prof. Diyi Yang from Stanford University gave the tutorial on&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/zcgzcgzcg1\/EACL2023_Tutorial_Dialogue_Summarization\" target=\"_blank\" rel=\"noopener noreferrer\">Summarization of Dialogues and Conversations At Scale<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;at EACL 2023.<\/p>\n\n\n\n<p>May 1, 2023: 5 papers accepted at ACL 2023.<\/p>\n\n\n\n<p>Apr. 25, 2023: Felipe Vieira Frujeri&#8217;s paper &#8220;DOTE: Rethinking (Predictive) WAN Traffic Engineering&#8221; received the Best Paper Award at NSDI 2023.<\/p>\n\n\n\n<p>Apr. 24, 2023: 2 papers accepted at ICML 2023.<\/p>\n\n\n\n<p>Feb. 28, 2023: 2 papers accepted at CVPR 2023.<\/p>\n\n\n\n<p>Feb. 27, 2023: Chenguang Zhu gave the talk &#8220;How We Achieved Human Parity in CommonsenseQA \u2013 Fusing Knowledge into Language Models&#8221; at Singapore Management University. 
[&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/drive.google.com\/file\/d\/1KW-3a2ynNwXj6t0U0v6ok2YFdtpePpTy\/view\" target=\"_blank\" rel=\"noopener noreferrer\">Slides<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> ]<\/p>\n\n\n\n<p>Feb. 27, 2023: We gave the tutorial on <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/zcgzcgzcg1\/WSDM2023_Knowledge_NLP_Tutorial\/\" target=\"_blank\" rel=\"noopener noreferrer\">Knowledge-Augmented Methods for Natural Language Processing<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> at WSDM 2023.<\/p>\n\n\n\n<p>Feb. 23, 2023: 1 paper accepted at TACL.<\/p>\n\n\n\n<p>Feb. 13, 2023: We organized&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/knowledge-nlp.github.io\/aaai2023\/\" target=\"_blank\" rel=\"noopener noreferrer\">The Workshop on Knowledge Augmented Methods for NLP (KnowledgeNLP-AAAI\u201923)<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;at AAAI 2023.<\/p>\n\n\n\n<p>Jan. 20, 2023: 2 papers accepted at ICLR 2023.<\/p>\n\n\n\n<p>Nov. 18, 2022: 1 paper accepted at AAAI 2023.<\/p>\n\n\n\n<p>Oct. 
6, 2022: 11 papers accepted at EMNLP 2022.<\/p>\n\n\n\n<p><\/p>\n\n\n","protected":false},"excerpt":{"rendered":"<p>The Knowledge and Language Team focuses on cutting edge research and development of the next generation framework for knowledge and natural language processing.<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_group_start":"","footnotes":""},"research-area":[13556,13545,13555],"msr-group-type":[243694],"msr-locale":[268875],"msr-impact-theme":[],"class_list":["post-741481","msr-group","type-msr-group","status-publish","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-language-technologies","msr-research-area-search-information-retrieval","msr-group-type-group","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[199565],"related-researchers":[{"type":"user_nicename","display_name":"Yang Liu","user_id":39594,"people_section":"Section name 0","alias":"yaliu10"},{"type":"user_nicename","display_name":"Shuohang Wang","user_id":39678,"people_section":"Section name 
0","alias":"shuowa"}],"related-publications":[845977,892380,891066,887520,886998,886611,883824,880455,851014,848155,847411,893601,842326,829249,827617,826330,820222,817771,817309,804583,802357,941436,946218,944655,942132,942126,942117,942048,942036,941796,941727,941568,792371,940389,932094,923253,913755,907008,905796,905703,904239,897264,654966,711862,708523,697399,696492,695946,695127,695118,665019,664602,656175,715066,648081,644682,644673,640596,603897,603846,590461,589327,558528,772204,787231,786202,786148,786139,784882,784051,779719,773680,772213,465852,771661,771646,769408,763792,747088,744739,744562,740710,732424],"related-downloads":[],"related-videos":[],"related-projects":[],"related-events":[],"related-opportunities":[],"related-posts":[806026,945684,951717,955086],"tab-content":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/741481","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":66,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/741481\/revisions"}],"predecessor-version":[{"id":959481,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/741481\/revisions\/959481"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=741481"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=741481"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=741481"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=74
1481"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=741481"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}