{"id":171483,"date":"2015-06-29T19:09:48","date_gmt":"2015-06-29T19:09:48","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/project\/semanticpaint-interactive-3d-labeling-and-learning-at-your-fingertips\/"},"modified":"2022-09-07T10:56:43","modified_gmt":"2022-09-07T17:56:43","slug":"semanticpaint-interactive-3d-labeling-and-learning-at-your-fingertips","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/semanticpaint-interactive-3d-labeling-and-learning-at-your-fingertips\/","title":{"rendered":"SemanticPaint: Interactive 3D Labeling and Learning at your Fingertips"},"content":{"rendered":"<p class=\"asset-content\">We present a new interactive approach to 3D scene understanding. Our system, SemanticPaint, allows users to simultaneously scan their environment whilst interactively segmenting the scene simply by reaching out and touching any desired object or surface. Our system continuously learns from these segmentations, and labels new unseen parts of the environment. Unlike offline systems, where capture, labeling and batch learning often take hours or even days to perform, our approach is fully online.<\/p>\n<p class=\"asset-content\">To be presented at SIGGRAPH &#8217;15. 
SemanticPaint is part of an ongoing collaboration between Microsoft Research and the\u00a0University of Oxford.<\/p>\n<p class=\"asset-content\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-233703\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2015\/06\/semanticpaint_teasernew.png\" alt=\"semanticpaint_teasernew\" width=\"589\" height=\"203\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2015\/06\/semanticpaint_teasernew.png 589w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2015\/06\/semanticpaint_teasernew-300x103.png 300w\" sizes=\"auto, (max-width: 589px) 100vw, 589px\" \/><\/p>\n<div id=\"en-usprojectssemanticpaintdefault\" class=\"page-content\">\n<p>&nbsp;<\/p>\n<p><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/z_TcWC7yjj0\" target=\"_blank\" rel=\"noopener noreferrer\">SemanticPaint Video:<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><br \/>\n<span id=\"9d6b4fa7-7133-4f26-9a90-b09c0826b42c\" class=\"ImageBlock fn\"><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/z_TcWC7yjj0\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" id=\"Image9d6b4fa7-7133-4f26-9a90-b09c0826b42c\" class=\"BorderedImageWrapper\" title=\"SemanticPaint YouTube video\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/02\/\/semanticpaint-semanticpaintyoutube.png\" alt=\"SemanticPaint YouTube video\" \/><span class=\"sr-only\"> (opens in new tab)<\/span><\/a><span id=\"ImageCaption9d6b4fa7-7133-4f26-9a90-b09c0826b42c\" class=\"ImageCaptionCoreCss ImageCaption\"><\/span><\/span><\/p>\n<p><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/Fnty9Zn4d98\" target=\"_blank\" rel=\"noopener 
noreferrer\">Supplementary Material Video:<br \/>\n<\/a><span id=\"60dad5cc-4bb0-441d-b156-92545feb15d0\" class=\"ImageBlock fn\"><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/youtu.be\/Fnty9Zn4d98\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" id=\"Image60dad5cc-4bb0-441d-b156-92545feb15d0\" class=\"BorderedImageWrapper\" title=\"SemanticPaint supplementary material video\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/02\/\/semanticpaint-semanticpaintsupplyoutube.png\" alt=\"SemanticPaint supplementary material video\" \/><span class=\"sr-only\"> (opens in new tab)<\/span><\/a><span id=\"ImageCaption60dad5cc-4bb0-441d-b156-92545feb15d0\" class=\"ImageCaptionCoreCss ImageCaption\"><\/span><\/span><\/p>\n<p>Main Contacts:<\/p>\n<ul>\n<li><strong>Shahram Izadi<\/strong>, Microsoft Research: <a href=\"mailto:shahrami@microsoft.com\">shahrami@microsoft.com<\/a><br \/>\nInteractive 3D Technologies, Microsoft Research<\/li>\n<\/ul>\n<ul>\n<li><strong>Philip Torr<\/strong>, University of Oxford: <a href=\"mailto:philip.torr@eng.ox.ac.uk\">philip.torr@eng.ox.ac.uk<\/a><br \/>\nTorr Vision Group, University of Oxford:\u00a0<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/www.robots.ox.ac.uk\/~tvg\/\">http:\/\/www.robots.ox.ac.uk\/~tvg\/<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n<\/ul>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>We present a new interactive approach to 3D scene understanding. Our system, SemanticPaint, allows users to simultaneously scan their environment whilst interactively segmenting the scene simply by reaching out and touching any desired object or surface. 
Our system continuously learns from these segmentations, and labels new unseen parts of the environment. Unlike offline systems, where [&hellip;]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13556,13562,13554],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-171483","msr-project","type-msr-project","status-publish","hentry","msr-research-area-artificial-intelligence","msr-research-area-computer-vision","msr-research-area-human-computer-interaction","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"6\/29\/2015","related-publications":[168293],"related-downloads":[],"related-videos":[322889],"related-groups":[],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Antonio Criminisi","user_id":41790,"people_section":"Section name 
0","alias":"acriminisi"}],"msr_research_lab":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/171483","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":3,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/171483\/revisions"}],"predecessor-version":[{"id":875997,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/171483\/revisions\/875997"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=171483"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=171483"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=171483"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=171483"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=171483"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}