{"id":825097,"date":"2022-03-09T19:06:21","date_gmt":"2022-03-10T03:06:21","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=825097"},"modified":"2024-02-06T22:32:45","modified_gmt":"2024-02-07T06:32:45","slug":"gesture-generation-for-service-robots","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/gesture-generation-for-service-robots\/","title":{"rendered":"Gesture generation for service robots"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background- card-background--full-bleed\">\n\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1253\" height=\"404\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/03\/toppage-62295c4d361f7.png\" class=\"attachment-full size-full\" alt=\"ARR_HRI\" style=\"object-position: 53% 56%\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/03\/toppage-62295c4d361f7.png 1253w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/03\/toppage-62295c4d361f7-300x97.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/03\/toppage-62295c4d361f7-1024x330.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/03\/toppage-62295c4d361f7-768x248.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/03\/toppage-62295c4d361f7-240x77.png 240w\" sizes=\"auto, (max-width: 1253px) 100vw, 1253px\" \/>\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div class=\"card-foreground d-flex mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div class=\"container d-flex mt-md-n5 my-lg-5 align-self-center\">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div 
class=\"card-body \">\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 class=\"wp-block-heading\" id=\"gesture-generation-for-service-robots\">Gesture generation for service robots<\/h1>\n\n\n\n<p><\/p>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<h4 class=\"wp-block-heading\" id=\"about-the-project\">About the project<\/h4>\n\n\n\n<p>One of the essential functions of service robots, as with other AI systems, is the conversational capability that facilitates interaction with humans. The key difference between service robots and other AI systems is that robots perform physical actions. Accordingly, one of the central issues in service-robot HRI is the generation of gestures during robot interactions. We are designing a basic library for gesture generation; gesture-authoring tools, referred to as&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/LabanotationSuite\" target=\"_blank\" rel=\"noopener noreferrer\">LabanSuite<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, for building customized libraries from human gestures; and a&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/gestureBotDesignKit\/\" target=\"_blank\" rel=\"noopener noreferrer\">handy DIY kit<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;for performing such gestures.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"research-topics\">Research topics<\/h4>\n\n\n\n<ul class=\"wp-block-list\">\n<li><em>Gesture-generation system from textual input.<\/em>&nbsp;Appropriate co-speech gestures convey intentions and emotions effectively and thus facilitate interaction with humans. We have been developing gesture-generation systems that produce gestures based on the semantics of the input text. 
Gestures are generated using gesture libraries that store human co-speech gestures in a hardware-independent format. Combined with recent conversational agents, this approach lets us build a natural human-robot interface for arbitrary robot hardware.<\/li>\n\n\n\n<li><em>Gesture authoring tool and handy DIY kit.<\/em> We have been developing an open-source project,&nbsp;<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/LabanotationSuite\" target=\"_blank\" rel=\"noopener noreferrer\">LabanSuite<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, for authoring human gestures captured by a Microsoft Kinect sensor. Gestures are encoded in Labanotation, a human dance notation that is hardware-independent. We also provide a <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/gestureBotDesignKit\/\" target=\"_blank\" rel=\"noopener noreferrer\">handy DIY kit<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>&nbsp;for building your own service robot and playing back gestures.<\/li>\n\n\n\n<li><em>Robot ego-noise reduction.<\/em> Recent advancements in speech recognition have improved the listening capability of service robots. However, the human speech recorded by a robot&#8217;s microphones often includes ego-noise, such as cooling-fan noise, generated while the robot operates. Such noise can degrade recognition performance. 
We have been developing noise filters that take the profile of this ego-noise into account.<\/li>\n<\/ul>\n\n\n\n\n\n<h2 class=\"wp-block-heading\" id=\"microsoft-applied-robotics-research-library\">Microsoft Applied Robotics Research Library<\/h2>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:5%\">\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"721\" height=\"719\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3.jpg\" alt=\"team logo\" class=\"wp-image-821842\" style=\"width:89px;height:89px\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3.jpg 721w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3-300x300.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3-150x150.jpg 150w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3-180x180.jpg 180w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3-360x360.jpg 360w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/02\/MicrosoftTeams-image-3-181x180.jpg 181w\" sizes=\"auto, (max-width: 721px) 100vw, 721px\" \/><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:95%\">\n<p>The Microsoft Applied Robotics Research Library is a collection of source code used in robotic applications, including Learning-from-Observation, human-robot interaction, and robot gesture generation.<\/p>\n<\/div>\n<\/div>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a class=\"msr-external-link glyph-append 
glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/github.com\/microsoft\/LabanotationSuite\" target=\"_blank\" rel=\"noopener noreferrer\">LabanotationSuite &#8211; open-source software tools to give service robots the ability to perform human-like gestures<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n\n\n\n<li><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/github.com\/microsoft\/GPT-Enabled-HSR-CoSpeechGestures\">Sample code to test co-speech gestures using the Toyota HSR robot<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n<\/ul>\n\n\n","protected":false},"excerpt":{"rendered":"<p>One of the essential functions of service robots, as with other AI systems, is the conversational capability that facilitates interaction with humans. The key difference between service robots and other AI systems is that robots perform physical actions. Accordingly, one of the central issues in service-robot HRI is the generation of gestures during robot interactions. 
We [&hellip;]<\/p>\n","protected":false},"featured_media":825202,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13556,13554],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-825097","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[821905,821920,825145,825151,825217,940239],"related-downloads":[],"related-videos":[],"related-groups":[668253],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"guest","display_name":"Hitoshi Teshima","user_id":825121,"people_section":"Project members","alias":""},{"type":"guest","display_name":"Machiko Sato","user_id":825220,"people_section":"Project members","alias":""},{"type":"guest","display_name":"Jun Takamatsu","user_id":821884,"people_section":"Project members","alias":""},{"type":"guest","display_name":"Kazuhiro Sasabuchi","user_id":821887,"people_section":"Project members","alias":""},{"type":"guest","display_name":"Atsushi Kanehira","user_id":1005129,"people_section":"Project members","alias":""},{"type":"guest","display_name":"Hiroshi Kawasaki","user_id":825124,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"Minako Nakamura","user_id":825232,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"Yuta Nakashima","user_id":825130,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"Diego Thomas","user_id":825127,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"David 
Baumert","user_id":1005108,"people_section":"Alumni","alias":""},{"type":"guest","display_name":"Jekaterina Jaroslavceva","user_id":825136,"people_section":"Alumni","alias":""},{"type":"guest","display_name":"Masaru Takizawa","user_id":825157,"people_section":"Alumni","alias":""},{"type":"guest","display_name":"Miki Watanabe","user_id":825172,"people_section":"Alumni","alias":""}],"msr_research_lab":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/825097","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":12,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/825097\/revisions"}],"predecessor-version":[{"id":1005111,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/825097\/revisions\/1005111"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/825202"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=825097"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=825097"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=825097"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=825097"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=825097"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}