{"id":199780,"date":"2012-01-09T12:10:52","date_gmt":"2012-01-09T12:10:52","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/events\/techfest-2012\/"},"modified":"2025-08-06T12:02:40","modified_gmt":"2025-08-06T19:02:40","slug":"techfest-2012","status":"publish","type":"msr-event","link":"https:\/\/www.microsoft.com\/en-us\/research\/event\/techfest-2012\/","title":{"rendered":"TechFest 2012"},"content":{"rendered":"\n\n<p>The latest thinking.\u00a0 The freshest ideas.<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<p>Each winter, TechFest exposes Microsoft employees and guests to compelling research projects from Microsoft Research\u2019s labs around the world. Researchers demonstrate their most recent achievements\u2014and the technologies those efforts have produced. TechFest enables product teams and researchers to interact, often leading to the transfer of groundbreaking technologies into Microsoft products.<\/p>\n<p>Explore the projects. Watch the videos.\u00a0Immerse yourself in TechFest content, and see how today\u2019s future will become tomorrow\u2019s reality.<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<p>Each year during TechFest, Microsoft Research displays a collection of cutting-edge research projects that offer new functionalities for Microsoft products and, often, for the greater research ecosystem. 
Many of those projects are discussed below.<\/p>\n<p>\t<div data-wp-context='{\"items\":[]}' data-wp-interactive=\"msr\/accordion\">\n\t\t\t\t\t<div class=\"clearfix\">\n\t\t\t\t<div\n\t\t\t\t\tclass=\"btn-group align-items-center mb-g float-sm-right\"\n\t\t\t\t\tdata-bi-aN=\"accordion-collapse-controls\"\n\t\t\t\t>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Expand all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onExpandAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tExpand all\t\t\t\t\t<\/button>\n\t\t\t\t\t<span aria-hidden=\"true\"> | <\/span>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Collapse all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onCollapseAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tCollapse all\t\t\t\t\t<\/button>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t\t\t<ul class=\"msr-accordion\">\n\t\t\t\t\t\t\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7278\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7278\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7277\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tA Natural UI for Polling Students in a 
Classroom\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7277\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7278\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>One of the biggest challenges facing teachers in a classroom is to gauge whether students are keeping up with the lesson. This challenge is especially acute in distance-education programs, because of the physical separation between students and teachers. This project delivers a new, low-cost technique for instantly polling students in the classroom. The approach enables teachers to ask the class a multiple-choice question. Students respond by holding up a sheet of paper printed with a code, similar to a QR code, that encodes both their answer and their student ID. A webcam automatically recognizes each response and, using computer-vision technology, aggregates the responses for immediate evaluation by the teacher. 
Initial trials in schools in Bangalore, India, show that the system is as accurate as a written test, as fast as a show of hands, and at least 10 times cheaper than alternative electronic solutions.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7280\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7280\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7279\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tApplied Sciences Group: Interactive Displays\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7279\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7280\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<ul>\n<li><strong>Telepresence using Wedge Technology:<\/strong> Glassless 3-D display with a correct camera pose and view pose for a live view-dependent 3-D Window Telepresence experience.<\/li>\n<li><strong>Behind the Screen Overlay Interactions:<\/strong> Behind-the-screen interaction with a transparent OLED with view-dependent, depth-corrected gaze.<\/li>\n<li><strong>Seeing Displays:<\/strong> Uses flat lenses (wedge) to see through a semi-transparent OLED for novel above-screen gesture and scanning interactions.<\/li>\n<li><strong>High-Performance Touch:<\/strong> A touch-display system with two orders of magnitude less latency than current systems.<\/li>\n<li><strong>Mayhem:<\/strong> A freely available, open-source Windows application that lets almost anyone use their computer to do stuff automatically across all their devices. 
Just select an event (e.g., your favorite stock hits a trigger value, the weather changes, you say something to your Kinect) and then select a reaction (e.g., advance a PowerPoint slide, turn on a lamp, start playing a movie), and within seconds, you have a connection running.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/lab\/applied-sciences-group\/\" target=\"_blank\">Learn more >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7282\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7282\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7281\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tAutomatic Building Parsing in Urban Areas\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7281\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7282\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Intense interest exists in intelligent online 3-D exploration and navigation of urban areas. Thus, understanding the 3-D structures of urban areas from captured images or videos becomes indispensable. Automatic Building Parsing in Urban Areas is a tool that can automatically detect the fa\u00e7ades in a single image or in multiple images. Beyond locating the position, it can also compute the geometry of each fa\u00e7ade\u2014the orientation of the plane\u2014without human interaction. The tool is implemented using our newly developed Robust Principal Component Analysis SDK and responds almost instantly. 
The interactive speed is demonstrated by a touch-based application that lets users dive into a 3-D tour of an urban area from a single image. This tool directly benefits navigation and smooth transitions between bird\u2019s-eye-view images and could become a fundamental tool for many urban applications. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/candidate-talk-matching-and-3d-reconstruction-in-urban-environments\/\">Learn more >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7284\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7284\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7283\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tAutomatic Text Pop-Up for Web Images\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7283\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7284\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The Bing home page provides teaser captions for an interesting image in the form of Bing tiles. The images are chosen carefully, and the captions are written by a person to make them interesting. The Automatic \u201cText Pop-Up\u201d for Web Images application automatically generates similar text descriptions for a large fraction of the most popular images on the web. At the core of the system is an offline text-extraction process, in which the application mines the web for meaningful captions that relate to a given image. 
During this, the application checks sentence semantics for relevancy, diversity, and optimal structure, and performs content filtering. The results are indexed in a database. The front end of the application is integrated into the Bing Toolbar in IE. Whenever a user navigates to a webpage, the application queries the database and overlays text descriptions for the images on the webpage in the form of Bing tiles: the text pop-up.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7286\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7286\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7285\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tBeamatron\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7285\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7286\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Beamatron is a new, augmented-reality concept that combines a projector and a Kinect camera on a pan-tilt moving head. The moving head is used to place the projected image almost anywhere in a room. Meanwhile, the depth camera enables the correct warping of the displayed image for the shape of the projection surface and for the projected graphics to react in physically appropriate ways. For example, a projected virtual car can be driven on the floor of the room but will bump into obstacles or run over ramps. 
As another application, we consider the ability to bring notifications and other graphics to the attention of the user by automatically placing the graphics within the user\u2019s view.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/beamatron\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7288\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7288\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7287\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tBing-Enabled Azure Data Services for Enterprises\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7287\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7288\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The goal of this project is to enable enterprise applications to benefit, in novel ways, from our Bing data assets: query logs, web-crawl data, and social-media data. This project illustrates the progress made so far. It identifies key Azure data services that have the potential to be widely useful for enterprises by combining Bing data assets, the Microsoft cloud-computing infrastructure, and deep data analytics. 
To bring the opportunities home, the project shows how Microsoft\u2019s enterprise software can consume these data services, and illustrates Bing-enabled enhancements that SharePoint Search and Microsoft Office products and services can potentially leverage.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7290\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7290\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7289\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tChronoZoom: Big History with Big Data\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7289\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7290\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>There are thousands of digital libraries, archives, collections, and repositories, but no easy way to find these datasets for teaching, learning, and research. To truly bridge the humanities and sciences and pull them out of their silos, we need a dynamic, cloud-based data-visualization tool where educators, researchers, and students can easily consume, compare, and understand the history of the cosmos, Earth, life, and humanity. 
There, they can consume rich media such as audio, video, text, PDFs, charts, graphs, and articles in one place and discover new possibilities.<\/p>\n<p>ChronoZoom will enable:<\/p>\n<ul>\n<li>Transitioning effortlessly between scales from one year to billions of years.<\/li>\n<li>Putting historical episodes, events, and trends in context without sacrificing precision.<\/li>\n<li>Comparing vast amounts of time-related data across different fields and disciplines.<\/li>\n<li>Gaining insight and the ability to shape the future by better understanding the cause-and-effect interplay between disciplines.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/chronozoom-an-infinite-canvas-in-time\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7292\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7292\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7291\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tCliplets: Juxtaposing Still and Dynamic Imagery\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7291\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7292\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>A still photograph is a limited format for capturing moments that span an interval of time. 
Video is the traditional method for recording durations of time, but the subjective \u201cmoment\u201d that one desires to capture is often lost in the chaos of shaky camerawork, irrelevant background clutter, and noise that dominates most casually recorded video clips. This work provides a creative lens for focusing on the important aspects of a moment by performing spatiotemporal compositing and editing on video-clip input. The result is an interactive app that uses semi-automated methods to give users the power to create \u201ccliplets\u201d from handheld videos\u2014a type of imagery that sits between stills and video. <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" target=\"_blank\" href=\"http:\/\/research.microsoft.com\/en-us\/um\/redmond\/projects\/cliplets\/\">Learn more >><span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/cliplets-juxtaposing-still-and-dynamic-imagery-2\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7294\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7294\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7293\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tFetchClimate! 
Building a Geographical Web Service\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7293\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7294\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>A huge amount of climate data is available, covering the whole of the Earth\u2019s surface. But even the experts find it ludicrously difficult to get the climate information they need: locate data sets, negotiate permissions, download huge files, make sense of file formats, get to grips with yet another library, filter, interpolate, regrid, etc! Enter FetchClimate, a fast, intelligent climate-data-retrieval service that operates over Windows Azure. FetchClimate can be used through a Silverlight web interface or from inside any .NET program. FetchClimate works at any grid resolution from global down to a few kilometers, for a range of years from 1900 to 2010, for days within a year, and for hours within a day. When multiple data sources could answer your query, FetchClimate automatically selects the most appropriate one, returning the requested values along with the level of uncertainty and the origin of the data. The entire query can be shared as a single URL, enabling others to retrieve the identical information. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/fetchclimate\/\">Learn more >><\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/fetchclimate-building-a-geographical-web-service\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7296\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7296\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7295\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tGesture Recognition with Next-Generation Webcam\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7295\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7296\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>This project presents next-generation webcam hardware and software prototypes. The new prototype webcam has a much wider view angle than traditional webcams and can capture stereo video and high-accuracy depth images simultaneously. Users can chat with stereoscopic video. Accurate depth-image processing can support not only all Kinect scenarios on a PC, but also a gesture-control user interface without a touch screen. Besides computer vision, the webcam includes a hardware accelerator and a new image-sensor design. The cost of the design is similar to that of current webcams, and the webcam could potentially be miniaturized into a mobile camera. 
The project showcases new user scenarios in playing games with this webcam.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/gesture-recognition-with-next-generation-webcam-2\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7298\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7298\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7297\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tHigh-Fidelity Facial-Animation Capturing\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7297\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7298\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>High-Fidelity Facial-Animation Capturing presents a new approach for acquiring high-fidelity 3-D facial performances with realistic dynamic wrinkles and finely scaled facial details. This approach leverages state-of-the-art motion-capture technology and advanced 3-D scanning technology for facial-performance acquisition. 
The system can capture facial performances that match both the spatial resolution of static face scans and the acquisition speed of motion-capture systems.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/high-fidelity-facial-animation-capturing\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7300\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7300\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7299\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tHoloflector\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7299\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7300\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Holoflector is a unique, interactive augmented-reality mirror. Graphics are superimposed correctly on your own reflection to enable an augmented-reality experience unlike anything you have seen before. 
It also leverages the combined abilities of Kinect and Windows Phone to infer the position of your phone and render graphics that seem to hover above it.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/holoflector\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7302\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7302\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7301\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tIllumiShare\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7301\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7302\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>IllumiShare enables remote people to share any physical or digital object on any surface. It is a low-cost, peripheral device that looks like a desk lamp, and just like a lamp lights up a surface at which it is pointed, IllumiShare shares a surface. To do this, IllumiShare uses a camera-projector pair where the camera captures video of the local workspace and sends it to the remote space and the projector projects video of the remote workspace onto the local space. With IllumiShare, people can sketch together using real ink and paper, remote meeting attendees can interact with conference room whiteboards, and children can have remote play dates in which they play with real toys. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/illumishare\/\">Learn more >><\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/illumishare\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7304\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7304\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7303\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tLanguage-Learning Games on WP7 and Kinect\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7303\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7304\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Language-Learning Games on WP7 and Kinect is a language learning project focusing on how to facilitate delightful \u201cedutainment\u201d experiences across a range of Microsoft platforms:<\/p>\n<ul>\n<li><strong>SpatialEase:<\/strong> An Xbox 360 Kinect game for learning the language of space using \u201cembodied\u201d learning that connects language with thought and action. 
The learner must quickly interpret second-language commands, such as the translation of \u201cmove your left hand right,\u201d and move his or her body accordingly.<\/li>\n<li><strong>Tip Tap Tones:<\/strong> A Windows Phone game for learning Chinese sounds\u2014a highly effective mobile game for retraining the ears and the brain to perceive tonal Chinese syllables quickly and accurately.<\/li>\n<li><strong>Polyword Flashcards:<\/strong> Cloud flashcards with integrated skill-based games. Building on our adaptive-learning algorithm, which has been transferred to Bing Dictionary, we have created an HTML5 platform for deeply personalized learning that blends language study, gaming, and discovery. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/language-learning-games\/\">Learn more >><\/a><\/li>\n<\/ul>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7306\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7306\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7305\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tMicrosoft Translator Hub: Translation by Everyone for Everyone\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7305\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7306\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Microsoft Translator Hub implements a self-service model for building a highly customized automatic translation service between any two languages. 
Microsoft Translator Hub empowers language communities, service providers, and corporations to create automatic translation systems, allowing speakers of one language to share knowledge with, and access knowledge from, speakers of any other language. By enabling translation to languages that aren\u2019t supported by today\u2019s mainstream translation engines, this also keeps less widely spoken languages vibrant and in use for future generations. This Azure-based service allows users to upload language data for custom training, and then build and deploy custom translation models. These machine-translation services are accessible using the Microsoft Translator APIs or a webpage widget. <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" target=\"_blank\" href=\"http:\/\/hub.microsofttranslator.com\/SignIn?returnURL=\/Home\/Index\">Learn more >><span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/microsoft-translator-hub-translation-by-everyone-for-everyone\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7308\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7308\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7307\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tNew Experiences in Search\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7307\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7308\"\n\t\t>\n\t\t\t<div 
class=\"msr-accordion__body\">\n\t\t\t\t<p>This project explores ways of experiencing search that complement fast, relevant responses to queries. In particular, these concepts focus on new ways to spend time, rather than save time, on the Web. Project components include:<\/p>\n<ul>\n<li>An organic kind of search that presents results that grow over time, drawing attention to the things you are most passionate about.<\/li>\n<li>A way of picturing and encapsulating search journeys so you can get pleasure from the voyage, as well as the destination.<\/li>\n<li>A way of packaging up search results so that they are collectable and can be given to others.<\/li>\n<li>As an ensemble, these demos emphasize self-expression and the creative use of search results over seeking and finding. They also focus on the importance of the search journey rather than the speed of delivering a result, recognizing that users often want to wander and explore the Web rather than quickly dip into it. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/new-experiences-in-search\/\">Learn more >><\/a><\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/new-experiences-in-search\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7310\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7310\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7309\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tTurn a Monolingual TTS into Mixed Language\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7309\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7310\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>A voice user interface needs to output responses in text-to-speech (TTS) synthesized speech. Sometimes, it is desirable to have the response in mixed languages; in a foreign country, it would be convenient if a driver not fluent in the local language could hear car-navigation instructions in mixed code\u2014entities such as street names synthesized in the local language and routing directions in the user\u2019s native language. A mixed-code TTS can be built by a truly bilingual speaker, but it is usually difficult to find such a person. This project shows a new approach to turning a monolingual TTS into a multilingual one. From a speaker\u2019s monolingual recordings, the algorithm can render speech sentences in different languages for building mixed-code, bilingual TTS systems. 
Recordings in 26 languages are used to build TTS voices for the corresponding languages. By using this new approach, we can synthesize any mixed-language pair out of the 26 languages. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/turning-a-monolingual-speaker-into-multi-lingual-speaker\/\">Learn more >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7312\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7312\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7311\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tWearable Multitouch Projector\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7311\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7312\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>This project is a depth-sensing and projection system that enables interactive multitouch applications on everyday surfaces. Beyond a shoulder-worn system, there is no instrumentation of the user or the environment. Foremost, on such surfaces\u2014without calibration\u2014Wearable Multitouch Interaction provides capabilities similar to those of a mouse or a touchscreen: X and Y locations in 2-D interfaces and whether fingers are \u201cclicked\u201d or hovering, enabling a wide variety of interactions. Reliable operation on the hands, for example, requires buttons to be 2.3 centimeters in diameter. 
Thus, it is now conceivable that anything one can do on today\u2019s mobile devices can be done in the palm of a hand.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/wearable-multitouch-projector\/\">Video >><\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7314\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7314\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7313\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tWhat\u2019s NUI? Explorations in Naturalness\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7313\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7314\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The idea of natural user interfaces has motivated researchers to discover new modalities of interaction such as gesture, voice and touch. Through various demonstrations, we explore how one particular interaction mechanism, Kinect-based gestural interaction, can open up new experiences in different ways and in different contexts. In one demonstration, we show how a touchless system for 3D image use in vascular surgery requires a constrained space for gestural movements. In another, we show how Kinect technology can open up new interactions in the dark, for example helping us to \u2018feel\u2019 an invisible shape through sound feedback. 
Whether such demonstrations are natural is open for debate, but there is no doubt that these new user experiences can fire the imagination as to what the possibilities for interaction may be in future.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t\t\t\t\t<\/ul>\n\t<\/div>\n\t<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Each winter, TechFest exposes Microsoft employees and guests to compelling research projects from Microsoft Research\u2019s labs around the world. Researchers demonstrate their most recent achievements\u2014and the technologies those efforts have produced. TechFest enables product teams and researchers to interact, often leading to the transfer of groundbreaking technologies into Microsoft products.<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_startdate":"2012-03-06","msr_enddate":"2012-03-06","msr_location":"Redmond, WA, U.S.","msr_expirationdate":"","msr_event_recording_link":"","msr_event_link":"","msr_event_link_redirect":false,"msr_event_time":"","msr_hide_region":false,"msr_private_event":true,"msr_hide_image_in_river":0,"footnotes":""},"research-area":[13562],"msr-region":[256048],"msr-event-type":[197941,197944],"msr-video-type":[],"msr-locale":[268875],"msr-program-audience":[],"msr-post-option":[],"msr-impact-theme":[],"class_list":["post-199780","msr-event","type-msr-event","status-publish","hentry","msr-research-area-computer-vision","msr-region-global","msr-event-type-conferences","msr-event-type-hosted-by-microsoft","msr-locale-en_us"],"msr_about":"<!-- wp:msr\/event-details {\"title\":\"TechFest 2012\",\"backgroundColor\":\"grey\"} \/-->\n\n<!-- 
wp:msr\/content-tabs --><!-- wp:msr\/content-tab {\"title\":\"Summary\"} --><!-- wp:freeform --><p>The latest thinking.\u00a0 The freshest ideas.<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<p>Each winter, TechFest exposes Microsoft employees and guests to compelling research projects from Microsoft Research\u2019s labs around the world. Researchers demonstrate their most recent achievements\u2014and the technologies those efforts have produced. TechFest enables product teams and researchers to interact, often leading to the transfer of groundbreaking technologies into Microsoft products.<\/p>\n<p>Explore the projects. Watch the videos.\u00a0Immerse yourself in TechFest content, and see how today\u2019s future will become tomorrow\u2019s reality.<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<!-- \/wp:freeform --><!-- \/wp:msr\/content-tab --><!-- wp:msr\/content-tab {\"title\":\"Projects\"} --><!-- wp:freeform --><p>Each year during TechFest, Microsoft Research displays a collection of cutting-edge research projects that offer new functionalities for Microsoft products and, often, for the greater research ecosystem. 
Many of those projects are discussed below.<\/p>\n<p>\t<div data-wp-context='{\"items\":[]}' data-wp-interactive=\"msr\/accordion\">\n\t\t\t\t\t<div class=\"clearfix\">\n\t\t\t\t<div\n\t\t\t\t\tclass=\"btn-group align-items-center mb-g float-sm-right\"\n\t\t\t\t\tdata-bi-aN=\"accordion-collapse-controls\"\n\t\t\t\t>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Expand all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onExpandAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tExpand all\t\t\t\t\t<\/button>\n\t\t\t\t\t<span aria-hidden=\"true\"> | <\/span>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Collapse all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onCollapseAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tCollapse all\t\t\t\t\t<\/button>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t\t\t<ul class=\"msr-accordion\">\n\t\t\t\t\t\t\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7278\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7278\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7277\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tA Natural UI for Polling Students in a 
Classroom\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7277\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7278\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>One of the biggest challenges facing teachers in a classroom is to gauge whether students are keeping up with the lesson. This challenge is especially acute in distance-education programs, because of the physical separation between students and teachers. This project delivers a new, low-cost technique for instantly polling students in the classroom. The approach enables teachers to ask the class a multiple-choice question. Students respond by holding up a sheet of paper printed with a code, similar to a QR code, that encodes their answers as well as their student IDs. A webcam automatically recognizes the responses and, using computer vision technology, aggregates them for immediate evaluation by the teacher. 
Initial trials in schools in Bangalore, India, show that the system is as accurate as a written test, as fast as a show of hands, and at least 10 times cheaper than alternative electronic solutions.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7280\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7280\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7279\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tApplied Sciences Group: Interactive Displays\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7279\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7280\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<ul>\n<li><strong>Telepresence using Wedge Technology:<\/strong> Glassless 3-D display with a correct camera pose and view pose for a live view-dependent 3-D Window Telepresence experience.<\/li>\n<li><strong>Behind the Screen Overlay Interactions:<\/strong> Behind-the-screen interaction with a transparent OLED with view-dependent, depth-corrected gaze.<\/li>\n<li><strong>Seeing Displays:<\/strong> Uses flat lenses (wedge) to see through a semi-transparent OLED for novel above-screen gesture and scanning interactions.<\/li>\n<li><strong>High-Performance Touch:<\/strong> A touch-display system with two orders of magnitude less latency than current systems.<\/li>\n<li><strong>Mayhem:<\/strong> A freely available, open-source Windows application that lets almost anyone use their computer to do stuff automatically across all their devices. 
Just select an event (e.g., your favorite stock hits a trigger value, the weather changes, you say something to your Kinect) and then select a reaction (e.g., advance a PowerPoint slide, turn on a lamp, start playing a movie), and within seconds, you have a connection running.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/lab\/applied-sciences-group\/\" target=\"_blank\">Learn more &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7282\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7282\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7281\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tAutomatic Building Parsing in Urban Areas\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7281\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7282\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>There is intense interest in intelligent online 3-D exploration and navigation of urban areas. Thus, understanding the 3-D structures of urban areas from captured images or videos becomes indispensable. Automatic Building Parsing in Urban Areas is a tool that can automatically detect the fa\u00e7ades in a single image or in multiple images. Beyond locating their positions, it can also compute the geometry of each fa\u00e7ade\u2014the orientation of the plane\u2014without human interaction. The tool is implemented using our newly developed Robust Principal Component Analysis SDK and responds almost instantly. 
The interactive speed is demonstrated by a touch-based application that lets users dive into a 3-D tour of an urban area from a single image. This tool directly benefits the navigation and smooth transition of bird\u2019s-eye-view images and could become a fundamental tool for many applications in urban areas. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/candidate-talk-matching-and-3d-reconstruction-in-urban-environments\/\">Learn more &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7284\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7284\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7283\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tAutomatic Text Pop-Up for Web Images\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7283\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7284\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The Bing home page provides teaser captions for interesting images in the form of Bing tiles. The images are chosen carefully, and the captions are written by a person to make them interesting. The Automatic \u201cText Pop-Up\u201d for Web Images application automatically generates similar text descriptions for a large fraction of the most popular images on the web. At the core of the system is an offline text-extraction process, in which the application mines the web for meaningful captions that relate to a given image. 
During this, the application checks sentence semantics for relevancy, diversity, and optimal structure, and performs content filtering. The results are indexed in a database. The front end of the application is integrated into the Bing Toolbar in IE. Whenever a user navigates to a webpage, the application queries the database and overlays text descriptions for the images on the webpage in the form of Bing tiles: the text pop-up.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7286\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7286\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7285\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tBeamatron\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7285\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7286\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Beamatron is a new, augmented-reality concept that combines a projector and a Kinect camera on a pan-tilt moving head. The moving head is used to place the projected image almost anywhere in a room. Meanwhile, the depth camera enables the correct warping of the displayed image for the shape of the projection surface and for the projected graphics to react in physically appropriate ways. For example, a projected virtual car can be driven on the floor of the room but will bump into obstacles or run over ramps. 
As another application, we consider the ability to bring notifications and other graphics to the attention of the user by automatically placing the graphics within the user\u2019s view.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/beamatron\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7288\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7288\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7287\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tBing-Enabled Azure Data Services for Enterprises\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7287\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7288\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The goal of this project is to leverage the opportunity for enterprise applications to significantly benefit in novel ways from our Bing data assets, specifically, query logs, web crawl and social media data. This project illustrates the progress made so far. It identifies key Azure data services that have the potential to be widely useful for enterprises by leveraging the combination of Bing data assets, the Microsoft cloud computing infrastructure and deep data analytics. 
To bring home the opportunities, the project shows how Microsoft\u2019s enterprise software can leverage these data services, and illustrates Bing-enabled enhancements that SharePoint Search and Microsoft Office products and services can potentially leverage.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7290\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7290\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7289\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tChronoZoom: Big History with Big Data\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7289\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7290\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>There are thousands of digital libraries, archives, collections and repositories, but no easy way to find these datasets for teaching, learning and research. To truly bridge the humanities and sciences and pull them out of their silos, we need a dynamic, cloud-based data visualization tool where educators, researchers and students can easily consume, compare and understand the history of the cosmos, earth, life and humanity. 
There, they can easily consume rich media sets such as audio, video, text, PDFs, charts, graphs and articles in one place and discover new possibilities.<\/p>\n<p>ChronoZoom will enable:<\/p>\n<ul>\n<li>Transitioning effortlessly between scales from one year to billions of years.<\/li>\n<li>Putting historical episodes, events, and trends in context without sacrificing precision.<\/li>\n<li>Comparing vast amounts of time-related data across different fields and disciplines.<\/li>\n<li>Gaining insight and the ability to shape the future by better understanding the cause-and-effect interplay between disciplines.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/chronozoom-an-infinite-canvas-in-time\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7292\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7292\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7291\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tCliplets: Juxtaposing Still and Dynamic Imagery\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7291\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7292\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>A still photograph is a limited format for capturing moments that span an interval of time. 
Video is the traditional method for recording durations of time, but the subjective \u201cmoment\u201d that one desires to capture is often lost in the chaos of shaky camerawork, irrelevant background clutter, and noise that dominates most casually recorded video clips. This work provides a creative lens used to focus on important aspects of a moment by performing spatiotemporal compositing and editing on video-clip input. This is an interactive app that uses semi-automated methods to give users the power to create \u201ccliplets\u201d from handheld videos\u2014a type of imagery that sits between stills and video. <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" target=\"_blank\" href=\"http:\/\/research.microsoft.com\/en-us\/um\/redmond\/projects\/cliplets\/\">Learn more &gt;&gt;<\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/cliplets-juxtaposing-still-and-dynamic-imagery-2\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7294\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7294\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7293\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tFetchClimate! 
Building a Geographical Web Service\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7293\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7294\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>A huge amount of climate data is available, covering the whole of the Earth\u2019s surface. But even the experts find it ludicrously difficult to get the climate information they need: locate data sets, negotiate permissions, download huge files, make sense of file formats, get to grips with yet another library, filter, interpolate, regrid, etc! Enter FetchClimate, a fast, intelligent climate-data-retrieval service that operates over Windows Azure. FetchClimate can be used through a Silverlight web interface or from inside any .NET program. FetchClimate works at any grid resolution from global to a few kilometers, in a range of years from 1900 to 2010, on days within a year, and for hours within a day. When multiple data sources could answer your query, FetchClimate automatically selects the most appropriate, returning the requested values along with the level of uncertainty and the origin of the data. The entire query can be shared as a single URL, enabling others to retrieve the identical information. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/fetchclimate\/\">Learn more &gt;&gt;<\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/fetchclimate-building-a-geographical-web-service\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7296\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7296\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7295\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tGesture Recognition with Next-Generation Webcam\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7295\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7296\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>This project presents next-generation webcam hardware and software prototypes. The new prototype webcam has a much wider viewing angle than traditional webcams and can capture stereo video and high-accuracy depth images simultaneously. Users can chat with stereoscopic video. Accurate depth-image processing can support not only all Kinect scenarios on a PC, but also a gesture-control user interface without a touch screen. Besides computer vision, the webcam includes a hardware accelerator and a new image-sensor design. The cost of the design is similar to that of current webcams, and the webcam could potentially be miniaturized into a mobile camera. 
The project showcases new user scenarios in playing games with this webcam.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/gesture-recognition-with-next-generation-webcam-2\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7298\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7298\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7297\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tHigh-Fidelity Facial-Animation Capturing\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7297\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7298\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>High-Fidelity Facial-Animation Capturing presents a new approach for acquiring high-fidelity, 3-D facial performances with realistic dynamic wrinkles and finely scaled facial details. This approach leverages state-of-the-art motion-capture technology and advanced 3-D scanning technology for facial-performance acquisition. 
The system can capture facial performances that match both the spatial resolution of static face scans and the acquisition speed of motion-capture systems.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/high-fidelity-facial-animation-capturing\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7300\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7300\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7299\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tHoloflector\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7299\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7300\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Holoflector is a unique, interactive augmented-reality mirror. Graphics are superimposed correctly on your own reflection to enable an augmented-reality experience unlike anything you have seen before. 
It also leverages the combined abilities of Kinect and Windows Phone to infer the position of your phone and render graphics that seem to hover above it.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/holoflector\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7302\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7302\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7301\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tIllumiShare\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7301\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7302\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>IllumiShare enables remote people to share any physical or digital object on any surface. It is a low-cost, peripheral device that looks like a desk lamp, and just like a lamp lights up a surface at which it is pointed, IllumiShare shares a surface. To do this, IllumiShare uses a camera-projector pair where the camera captures video of the local workspace and sends it to the remote space and the projector projects video of the remote workspace onto the local space. With IllumiShare, people can sketch together using real ink and paper, remote meeting attendees can interact with conference room whiteboards, and children can have remote play dates in which they play with real toys. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/illumishare\/\">Learn more &gt;&gt;<\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/illumishare\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7304\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7304\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7303\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tLanguage-Learning Games on WP7 and Kinect\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7303\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7304\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Language-Learning Games on WP7 and Kinect is a language learning project focusing on how to facilitate delightful \u201cedutainment\u201d experiences across a range of Microsoft platforms:<\/p>\n<ul>\n<li><strong>SpatialEase:<\/strong> An Xbox 360 Kinect game for learning the language of space using \u201cembodied\u201d learning that connects language with thought and action. 
The learner must quickly interpret second-language commands, such as the translation of \u201cmove your left hand right,\u201d and move his or her body accordingly.<\/li>\n<li><strong>Tip Tap Tones:<\/strong> A Windows Phone game for learning Chinese sounds\u2014a highly effective mobile game for retraining the ears and the brain to perceive tonal Chinese syllables quickly and accurately.<\/li>\n<li><strong>Polyword Flashcards:<\/strong> Cloud flashcards with integrated skill-based games. Based on our adaptive learning algorithm, transferred to Bing Dictionary, we have created an HTML5 platform for deeply personalized learning that blends language study, gaming, and discovery. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/language-learning-games\/\">Learn more &gt;&gt;<\/a><\/li>\n<\/ul>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7306\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7306\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7305\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tMicrosoft Translator Hub: Translation by Everyone for Everyone\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7305\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7306\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>Microsoft Translator Hub implements a self-service model for building a highly customized automatic translation service between any two languages. 
Microsoft Translator Hub empowers language communities, service providers and corporations to create automatic translation systems, allowing speakers of one language to share and access knowledge with speakers of any other language. By enabling translation to languages that aren\u2019t supported by today\u2019s mainstream translation engines, this also keeps less widely spoken languages vibrant and in use for future generations. This Azure-based service allows users to upload language data for custom training, and then build and deploy custom translation models. These machine translation services are accessible using the Microsoft Translator APIs or a webpage widget. <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" target=\"_blank\" href=\"http:\/\/hub.microsofttranslator.com\/SignIn?returnURL=\/Home\/Index\">Learn more &gt;&gt;<\/a><\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/microsoft-translator-hub-translation-by-everyone-for-everyone\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7308\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7308\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7307\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tNew Experiences in Search\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7307\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7308\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>This project explores ways 
for people to experience search that are complementary to fast, relevant search in response to queries. In particular, these concepts focus on new ways to spend time, rather than save time on the Web. Project components include:<\/p>\n<ul>\n<li>An organic kind of search that presents results which grow over time, drawing attention to the things you are most passionate about.<\/li>\n<li>A way of picturing and encapsulating search journeys so you can get pleasure from the voyage, as well as the destination.<\/li>\n<li>A way of packaging up search results so that they are collectable and can be given to others.<\/li>\n<li>As an ensemble, these demos emphasize self-expression and the creative use of search results over seeking and finding. They also focus on the importance of the search journey rather than the speed of delivering a result, recognizing that users often want to wander and explore the Web rather than quickly dip into it. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/new-experiences-in-search\/\">Learn more &gt;&gt;<\/a><\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/new-experiences-in-search\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7310\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7310\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7309\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tTurn a Monolingual TTS into Mixed 
Language\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7309\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7310\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>A voice user interface needs to output responses in text-to-speech (TTS) synthesized speech. Sometimes, it is desirable to have the response in mixed languages; in a foreign country, it would be convenient if a driver not fluent in the local language could hear car-navigation instructions in mixed codes\u2014entities such as street names synthesized in the local language and routing directions in the user\u2019s native language. A mixed-coded TTS can be built from recordings of a truly bilingual speaker, but it is usually difficult to find such a person. This project shows a new approach to turning a monolingual TTS into a multilingual one. From a speaker\u2019s monolingual recordings, the algorithm can render speech sentences of different languages for building mixed-coded, bilingual TTS systems. Recordings of 26 languages are used to build the TTS of corresponding languages. By using this new approach, we can synthesize any mixed-language pair out of the 26 languages. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/turning-a-monolingual-speaker-into-multi-lingual-speaker\/\">Learn more &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7312\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7312\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7311\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tWearable Multitouch Projector\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7311\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7312\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>This project is a depth-sensing and projection system that enables interactive multitouch applications on everyday surfaces. Beyond a shoulder-worn system, there is no instrumentation of the user or the environment. Foremost, on such surfaces\u2014without calibration\u2014Wearable Multitouch Interaction provides capabilities similar to those of a mouse or a touchscreen: X and Y locations in 2-D interfaces and whether fingers are \u201cclicked\u201d or hovering, enabling a wide variety of interactions. Reliable operation on the hands, for example, requires buttons to be 2.3 centimeters in diameter. 
Thus, it is now conceivable that anything one can do on today\u2019s mobile devices can be done in the palm of a hand.<\/p>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/wearable-multitouch-projector\/\">Video &gt;&gt;<\/a><\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-7314\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-7314\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-7313\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tWhat\u2019s NUI? Explorations in Naturalness\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-7313\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-7314\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The idea of natural user interfaces has motivated researchers to discover new modalities of interaction such as gesture, voice and touch. Through various demonstrations, we explore how one particular interaction mechanism, Kinect-based gestural interaction, can open up new experiences in different ways and in different contexts. In one demonstration, we show how a touchless system for 3D image use in vascular surgery requires a constrained space for gestural movements. In another, we show how Kinect technology can open up new interactions in the dark, for example helping us to \u2018feel\u2019 an invisible shape through sound feedback. 
Whether such demonstrations are natural is open for debate, but there is no doubt that these new user experiences can fire the imagination as to what the possibilities for interaction may be in future.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t\t\t\t\t<\/ul>\n\t<\/div>\n\t<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<!-- \/wp:freeform --><!-- \/wp:msr\/content-tab --><!-- \/wp:msr\/content-tabs -->","tab-content":[{"id":0,"name":"Summary","content":"Each winter, TechFest exposes Microsoft employees and guests to compelling research projects from Microsoft Research\u2019s labs around the world. Researchers demonstrate their most recent achievements\u2014and the technologies those efforts have produced. TechFest enables product teams and researchers to interact, often leading to the transfer of groundbreaking technologies into Microsoft products.\r\n\r\nExplore the projects. Watch the videos.\u00a0Immerse yourself in TechFest content, and see how today\u2019s future will become tomorrow\u2019s reality."},{"id":1,"name":"Projects","content":"Each year during TechFest, Microsoft Research displays a collection of cutting-edge research projects that offer new functionalities for Microsoft products and, often, for the greater research ecosystem. Many of those projects are discussed below.\r\n\r\n[accordion]\r\n\r\n[panel header=\"A Natural UI for Polling Students in a Classroom\"]\r\n\r\nOne of the biggest challenges facing teachers in a classroom is to gauge whether students are keeping up with the lesson. This challenge is especially acute in distance-education programs, because of the physical separation between students and teachers. This project delivers a new, low-cost technique for instantly polling students in the classroom. 
The approach enables teachers to ask the class a multiple-choice question. Students respond by holding up a sheet of paper printed with a code, similar to a QR code, that encodes their answer along with their student ID. A webcam, using computer-vision technology, automatically recognizes and aggregates the responses for immediate evaluation by the teacher. Initial trials in schools in Bangalore, India, show the system is as accurate as a written test, as fast as a show of hands, and at least 10 times cheaper than alternative electronic solutions.\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Applied Sciences Group: Interactive Displays\"]\r\n<ul>\r\n \t<li><strong>Telepresence using Wedge Technology:<\/strong> Glassless 3-D display with a correct camera pose and view pose for a live view-dependent 3-D Window Telepresence experience.<\/li>\r\n \t<li><strong>Behind the Screen Overlay Interactions:<\/strong> Behind-the-screen interaction with a transparent OLED with view-dependent, depth-corrected gaze.<\/li>\r\n \t<li><strong>Seeing Displays:<\/strong> Uses flat lenses (wedge) to see through a semi-transparent OLED for novel above-screen gesture and scanning interactions.<\/li>\r\n \t<li><strong>High-Performance Touch:<\/strong> A touch-display system with two orders of magnitude less latency than current systems.<\/li>\r\n \t<li><strong>Mayhem:<\/strong> A freely available, open source Windows application that lets almost anyone use their computer to do stuff automatically across all their devices. Just select an event (e.g. your favorite stock hits a trigger value, the weather changes, you say something to your Kinect, etc.) and then select a reaction (e.g. 
advance a PowerPoint slide, turn on a lamp, start playing a movie, etc.), and within seconds, you have a connection running.<\/li>\r\n<\/ul>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/lab\/applied-sciences-group\/\" target=\"_blank\">Learn more &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Automatic Building Parsing in Urban Areas\"]\r\n\r\nIntense interest exists in intelligent online 3-D exploration and navigation in urban areas. Thus, understanding the 3-D structures of urban areas from captured images or videos becomes indispensable. Automatic Building Parsing in Urban Areas is a tool that can automatically detect the fa\u00e7ades in a single image or in multiple images. Beyond locating the position, it also can compute the geometry of each fa\u00e7ade\u2014the orientation of the plane\u2014without human interaction. The tool is implemented using our newly developed Robust Principal Component Analysis SDK and has a short response time. The interactive speed is demonstrated by a touch-based application that lets users dive into a 3-D tour of an urban area from a single image. This tool directly benefits the navigation and smooth transition of bird\u2019s-eye-view images and could become a fundamental tool for many applications in urban areas. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/candidate-talk-matching-and-3d-reconstruction-in-urban-environments\/\">Learn more &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Automatic Text Pop-Up for Web Images\"]\r\n\r\nThe Bing home page provides teaser captions for an interesting image in the form of Bing tiles. The images are chosen carefully and the captions are written by a person to make them interesting. The Automatic \u201cText Pop-Up\u201d for Web Images application automatically generates similar text descriptions for a large fraction of the most popular images on the web. 
At the core of the system is an offline text-extraction process, in which the application mines the web for meaningful captions that relate to a given image. During this, the application checks sentence semantics for relevancy, diversity, and optimal structure, and performs content filtering. The results are indexed in a database. The front end of the application is integrated into the Bing Toolbar in IE. Whenever a user navigates to a webpage, the application queries the database and overlays text descriptions for the images on the webpage in the form of Bing tiles: the text pop-up.\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Beamatron\"]\r\n\r\nBeamatron is a new, augmented-reality concept that combines a projector and a Kinect camera on a pan-tilt moving head. The moving head is used to place the projected image almost anywhere in a room. Meanwhile, the depth camera enables the correct warping of the displayed image for the shape of the projection surface and for the projected graphics to react in physically appropriate ways. For example, a projected virtual car can be driven on the floor of the room but will bump into obstacles or run over ramps. As another application, we consider the ability to bring notifications and other graphics to the attention of the user by automatically placing the graphics within the user\u2019s view.\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/beamatron\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Bing-Enabled Azure Data Services for Enterprises\"]\r\n\r\nThe goal of this project is to leverage the opportunity for enterprise applications to significantly benefit in novel ways from our Bing data assets, specifically, query logs, web crawl and social media data. This project illustrates the progress made so far. 
It identifies key Azure data services that have the potential to be widely useful for enterprises by leveraging the combination of Bing data assets, the Microsoft cloud computing infrastructure and deep data analytics. To demonstrate these opportunities, the project shows how Microsoft\u2019s enterprise software can take advantage of these data services, and illustrates Bing-enabled enhancements that SharePoint Search and Microsoft Office products and services can potentially leverage.\r\n\r\n[\/panel]\r\n\r\n[panel header=\"ChronoZoom: Big History with Big Data\"]\r\n\r\nThere are thousands of digital libraries, archives, collections and repositories, and no easy way to find these datasets for teaching, learning and research. To truly bridge the humanities and sciences and pull them out of their silos, we need a dynamic, cloud-based data-visualization tool where educators, researchers and students can easily consume, compare and understand the history of the cosmos, earth, life and humanity\u2014and where they can consume rich media sets such as audio, video, text, PDFs, charts, graphs and articles in one place and discover new possibilities.\r\n\r\nChronoZoom will enable:\r\n<ul>\r\n \t<li>Transitioning effortlessly across scales from one year to billions of years.<\/li>\r\n \t<li>Putting historical episodes, events, and trends in context without sacrificing precision.<\/li>\r\n \t<li>Comparing vast amounts of time-related data across different fields and disciplines.<\/li>\r\n \t<li>Gaining insight and the ability to shape the future by better understanding the cause-and-effect interplay between disciplines.<\/li>\r\n<\/ul>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/chronozoom-an-infinite-canvas-in-time\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Cliplets: Juxtaposing Still and Dynamic Imagery\"]\r\n\r\nA still photograph is a limited format for capturing moments that span an interval of time. 
Video is the traditional method for recording durations of time, but the subjective \u201cmoment\u201d that one desires to capture is often lost in the chaos of shaky camerawork, irrelevant background clutter, and noise that dominates most casually recorded video clips. This work provides a creative lens used to focus on important aspects of a moment by performing spatiotemporal compositing and editing on video-clip input. It is an interactive app that uses semi-automated methods to give users the power to create \u201ccliplets\u201d from handheld videos\u2014a type of imagery that sits between stills and video. <a href=\"http:\/\/research.microsoft.com\/en-us\/um\/redmond\/projects\/cliplets\/\">Learn more &gt;&gt;<\/a>\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/cliplets-juxtaposing-still-and-dynamic-imagery-2\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"FetchClimate! Building a Geographical Web Service\"]\r\n\r\nA huge amount of climate data is available, covering the whole of the Earth\u2019s surface. But even the experts find it ludicrously difficult to get the climate information they need: locate data sets, negotiate permissions, download huge files, make sense of file formats, get to grips with yet another library, filter, interpolate, regrid, etc.! Enter FetchClimate, a fast, intelligent climate-data-retrieval service that operates over Windows Azure. FetchClimate can be used through a Silverlight web interface or from inside any .NET program. FetchClimate works at any grid resolution from global to a few kilometers, in a range of years from 1900 to 2010, on days within a year, and for hours within a day. When multiple data sources could answer your query, FetchClimate automatically selects the most appropriate, returning the requested values along with the level of uncertainty and the origin of the data. The entire query can be shared as a single URL, enabling others to retrieve the identical information. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/fetchclimate\/\">Learn more &gt;&gt;<\/a>\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/fetchclimate-building-a-geographical-web-service\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Gesture Recognition with Next-Generation Webcam\"]\r\n\r\nThis project presents next-generation webcam hardware and software prototypes. The new prototype webcam has a much wider viewing angle than traditional webcams and can capture stereo video and high-accuracy depth images simultaneously. Users can chat with stereoscopic video. Accurate depth-image processing can support not only all Kinect scenarios on a PC, but also a gesture-control user interface without a touch screen. Besides computer vision, the webcam includes a hardware accelerator and a new image-sensor design. The cost of the design is similar to that of current webcams, and the webcam potentially could be miniaturized as a mobile camera. The project showcases new user scenarios for playing games with this webcam.\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/gesture-recognition-with-next-generation-webcam-2\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"High-Fidelity Facial-Animation Capturing\"]\r\n\r\nHigh-Fidelity Facial-Animation Capturing presents a new approach for acquiring high-fidelity, 3-D facial performances with realistic dynamic wrinkles and finely scaled facial details. This approach leverages state-of-the-art motion-capture technology and advanced 3-D scanning technology for facial-performance acquisition. 
The system can capture facial performances that match both the spatial resolution of static face scans and the acquisition speed of motion-capture systems.\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/high-fidelity-facial-animation-capturing\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Holoflector\"]\r\n\r\nHoloflector is a unique, interactive augmented-reality mirror. Graphics are superimposed correctly on your own reflection to enable an augmented-reality experience unlike anything you have seen before. It also leverages the combined abilities of Kinect and Windows Phone to infer the position of your phone and render graphics that seem to hover above it.\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/holoflector\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"IllumiShare\"]\r\n\r\nIllumiShare enables remote people to share any physical or digital object on any surface. It is a low-cost, peripheral device that looks like a desk lamp, and just like a lamp lights up a surface at which it is pointed, IllumiShare shares a surface. To do this, IllumiShare uses a camera-projector pair where the camera captures video of the local workspace and sends it to the remote space and the projector projects video of the remote workspace onto the local space. With IllumiShare, people can sketch together using real ink and paper, remote meeting attendees can interact with conference room whiteboards, and children can have remote play dates in which they play with real toys. 
<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/illumishare\/\">Learn more &gt;&gt;<\/a>\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/illumishare\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Language-Learning Games on WP7 and Kinect\"]\r\n\r\nLanguage-Learning Games on WP7 and Kinect is a language learning project focusing on how to facilitate delightful \u201cedutainment\u201d experiences across a range of Microsoft platforms:\r\n<ul>\r\n \t<li><strong>SpatialEase:<\/strong> An Xbox 360 Kinect game for learning the language of space using \u201cembodied\u201d learning that connects language with thought and action. The learner must quickly interpret second-language commands, such as the translation of \u201cmove your left hand right,\u201d and move his or her body accordingly.<\/li>\r\n \t<li><strong>Tip Tap Tones:<\/strong> A Windows Phone game for learning Chinese sounds\u2014a highly effective mobile game for retraining the ears and the brain to perceive tonal Chinese syllables quickly and accurately.<\/li>\r\n \t<li><strong>Polyword Flashcards:<\/strong> Cloud flashcards with integrated skill-based games. Based on our adaptive learning algorithm, transferred to Bing Dictionary, we have created an HTML5 platform for deeply personalized learning that blends language study, gaming, and discovery. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/language-learning-games\/\">Learn more &gt;&gt;<\/a><\/li>\r\n<\/ul>\r\n[\/panel]\r\n\r\n[panel header=\"Microsoft Translator Hub: Translation by Everyone for Everyone\"]\r\n\r\nMicrosoft Translator Hub implements a self-service model for building a highly customized automatic translation service between any two languages. 
Microsoft Translator Hub empowers language communities, service providers and corporations to create automatic translation systems, allowing speakers of one language to share and access knowledge with speakers of any other language. By enabling translation to languages that aren\u2019t supported by today\u2019s mainstream translation engines, this also keeps less widely spoken languages vibrant and in use for future generations. This Azure-based service allows users to upload language data for custom training, and then build and deploy custom translation models. These machine translation services are accessible using the Microsoft Translator APIs or a webpage widget. <a href=\"http:\/\/hub.microsofttranslator.com\/SignIn?returnURL=\/Home\/Index\">Learn more &gt;&gt;<\/a>\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/microsoft-translator-hub-translation-by-everyone-for-everyone\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"New Experiences in Search\"]\r\n\r\nThis project explores ways for people to experience search that are complementary to fast, relevant search in response to queries. In particular, these concepts focus on new ways to spend time, rather than save time on the Web. Project components include:\r\n<ul>\r\n \t<li>An organic kind of search that presents results which grow over time, drawing attention to the things you are most passionate about.<\/li>\r\n \t<li>A way of picturing and encapsulating search journeys so you can get pleasure from the voyage, as well as the destination.<\/li>\r\n \t<li>A way of packaging up search results so that they are collectable and can be given to others.<\/li>\r\n \t<li>As an ensemble, these demos emphasize self-expression and the creative use of search results over seeking and finding. 
They also focus on the importance of the search journey, rather than the speed of delivering a result, recognizing that users often want to wander and explore the Web rather than quickly dip into it. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/new-experiences-in-search\/\">Learn more &gt;&gt;<\/a><\/li>\r\n<\/ul>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/new-experiences-in-search\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Turn a Monolingual TTS into Mixed Language\"]\r\n\r\nA voice user interface needs to output responses as text-to-speech (TTS) synthesized speech. Sometimes, it is desirable to have the response in mixed languages: in a foreign country, it would be convenient for a driver who is not fluent in the local language if the car-navigation system could deliver instructions in mixed code\u2014entities such as street names synthesized in the local language and routing directions in the user\u2019s native language. A mixed-code TTS system can be built from the recordings of a truly bilingual speaker, but such a speaker is usually difficult to find. This project shows a new approach to turning a monolingual TTS system into a multilingual one. From a speaker\u2019s monolingual recordings, the algorithm can render speech in different languages for building mixed-code, bilingual TTS systems. Recordings in 26 languages are used to build TTS systems for the corresponding languages, and with this approach, we can synthesize any mixed-language pair out of the 26. <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/turning-a-monolingual-speaker-into-multi-lingual-speaker\/\">Learn more &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"Wearable Multitouch Projector\"]\r\n\r\nThis project is a depth-sensing and projection system that enables interactive multitouch applications on everyday surfaces. Beyond a shoulder-worn system, there is no instrumentation of the user or the environment. 
On such surfaces, without calibration, Wearable Multitouch Interaction provides capabilities similar to those of a mouse or a touchscreen: X and Y locations in 2-D interfaces and whether fingers are \u201cclicked\u201d or hovering, enabling a wide variety of interactions. Reliable operation on the hands, for example, requires buttons to be 2.3 centimeters in diameter. Thus, it is now conceivable that anything one can do on today\u2019s mobile devices can be done in the palm of a hand.\r\n\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/wearable-multitouch-projector\/\">Video &gt;&gt;<\/a>\r\n\r\n[\/panel]\r\n\r\n[panel header=\"What\u2019s NUI? Explorations in Naturalness\"]\r\n\r\nThe idea of natural user interfaces has motivated researchers to discover new modalities of interaction such as gesture, voice, and touch. Through various demonstrations, we explore how one particular interaction mechanism, Kinect-based gestural interaction, can open up new experiences in different ways and in different contexts. In one demonstration, we show how a touchless system for using 3D images in vascular surgery requires a constrained space for gestural movements. In another, we show how Kinect technology can open up new interactions in the dark, for example helping us to \u2018feel\u2019 an invisible shape through sound feedback. 
Whether such demonstrations are natural is open for debate, but there is no doubt that these new user experiences can fire the imagination as to what the possibilities for interaction may be in future.\r\n\r\n[\/panel]\r\n\r\n[\/accordion]"}],"msr_startdate":"2012-03-06","msr_enddate":"2012-03-06","msr_event_time":"","msr_location":"Redmond, WA, U.S.","msr_event_link":"","msr_event_recording_link":"","msr_startdate_formatted":"March 6, 2012","msr_register_text":"Watch now","msr_cta_link":"","msr_cta_text":"","msr_cta_bi_name":"","featured_image_thumbnail":null,"event_excerpt":"Each winter, TechFest exposes Microsoft employees and guests to compelling research projects from Microsoft Research\u2019s labs around the world. Researchers demonstrate their most recent achievements\u2014and the technologies those efforts have produced. TechFest enables product teams and researchers to interact, often leading to the transfer of groundbreaking technologies into Microsoft products.","msr_research_lab":[199565],"related-researchers":[],"msr_impact_theme":[],"related-academic-programs":[],"related-groups":[],"related-projects":[],"related-opportunities":[],"related-publications":[],"related-videos":[187434,187554],"related-posts":[304346],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/199780","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-event"}],"version-history":[{"count":2,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/199780\/revisions"}],"predecessor-version":[{"id":1147438,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/199780\/revisions\/1147438"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=199780"}],"wp:term":[{"tax
onomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=199780"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=199780"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=199780"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=199780"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=199780"},{"taxonomy":"msr-program-audience","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-program-audience?post=199780"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=199780"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=199780"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}