{"id":609591,"date":"2019-08-01T13:00:30","date_gmt":"2019-08-01T20:00:30","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&#038;p=609591"},"modified":"2019-09-19T11:04:19","modified_gmt":"2019-09-19T18:04:19","slug":"recent-advances-in-unsupervised-image-to-image-translation","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/recent-advances-in-unsupervised-image-to-image-translation\/","title":{"rendered":"Recent Advances in Unsupervised Image-to-Image Translation"},"content":{"rendered":"<p>Unsupervised image-to-image translation aims to map an image drawn from one distribution to an analogous image in a different distribution, without seeing any example pairs of analogous images. For example, given an image of a landscape taken in the summer, one may want to know what it would look like in the winter. There is not just a single answer: one could imagine many possibilities due to differences in weather, timing, lighting, etc. However, existing work can only deterministically produce a single output given the same input. To address this limitation, we propose a Multimodal Unsupervised Image-to-image Translation (MUNIT) framework that produces diverse and realistic translation results. We further extend our model to the few-shot scenario, where only a few images from the target distribution are available, and only at test time. This model, named FUNIT, is trained to translate images between many different pairs of distributions using a few examples, so that it generalizes to unseen target distributions. 
Extensive experimental comparisons demonstrate the effectiveness of the proposed frameworks.<\/p>\n<p>[<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/09\/44004_Recent_Advances_in_Unsupervised_Image_to_Image_Translation.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Slides<\/a>]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Unsupervised image-to-image translation aims to map an image drawn from one distribution to an analogous image in a different distribution, without seeing any example pairs of analogous images. For example, given an image of a landscape taken in the summer, one may want to know what it would look like in the winter. There is [&hellip;]<\/p>\n","protected":false},"featured_media":609600,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_hide_image_in_river":0,"footnotes":""},"research-area":[13556,13562,13551],"msr-video-type":[206954],"msr-locale":[268875],"msr-post-option":[],"msr-session-type":[],"msr-impact-theme":[],"msr-pillar":[],"msr-episode":[],"msr-research-theme":[],"class_list":["post-609591","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-computer-vision","msr-research-area-graphics-and-multimedia","msr-video-type-microsoft-research-talks","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/NsPMlDsRCkM","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/609591","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":2,"href":"https:\/\/www.microso
ft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/609591\/revisions"}],"predecessor-version":[{"id":609642,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/609591\/revisions\/609642"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/609600"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=609591"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=609591"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=609591"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=609591"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=609591"},{"taxonomy":"msr-session-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-session-type?post=609591"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=609591"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=609591"},{"taxonomy":"msr-episode","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-episode?post=609591"},{"taxonomy":"msr-research-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-theme?post=609591"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}