{"id":379364,"date":"2017-04-26T17:48:02","date_gmt":"2017-04-27T00:48:02","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=379364"},"modified":"2018-07-24T10:30:52","modified_gmt":"2018-07-24T17:30:52","slug":"pen-touch-interaction","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/pen-touch-interaction\/","title":{"rendered":"Pen + Touch Interaction"},"content":{"rendered":"<p>Microsoft Research has\u00a0long pioneered new techniques in digital pen input, including particularly\u00a0its combination with multi-touch in a manner that reflects how people naturally use their hands.<\/p>\n<p>For example, in the real world people hold (\u201ctouch\u201d) a document with their non-preferred hand, and often frame a particular region of interest between thumb and forefinger. The pen (held, of course, in the preferred hand) then marks up the page.<\/p>\n<p>Interfaces with such\u00a0\u201cPen + Touch\u201d capabilities are now deeply integrated with Windows and\u00a0distinguish Microsoft\u2019s innovative line of Surface products, including the Surface Pro tablet, Surface Studio drafting table, and the Surface Hub electronic whiteboard.<\/p>\n<div id=\"attachment_379379\" style=\"width: 722px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-379379\" class=\"size-full wp-image-379379\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/04\/natural-behavior-framing.jpg\" alt=\"Natural behavior of framing part of the page and writing in reference to it\" width=\"712\" height=\"412\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/04\/natural-behavior-framing.jpg 712w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/04\/natural-behavior-framing-300x174.jpg 300w\" sizes=\"auto, (max-width: 712px) 100vw, 712px\" \/><p id=\"caption-attachment-379379\" class=\"wp-caption-text\">This example shows a common natural behavior while interacting with paper documents or notebooks in the real world. 
[Figure] A common natural behavior while interacting with paper documents or notebooks in the real world: one hand "frames" the portion of the page the individual is working on, while the other hand marks in reference to this area with the pen.

[Figure] Another example is orienting a page with one hand while manipulating the writing implement with the other, as in this example of "crayon and touch" technology from everyday life.

[Figure] Our work stems from a desire to understand and leverage the full capabilities of both hands...

[Figure] ...while simultaneously leveraging the mutual affordances of both pen and touch as input modalities.

This page collects a number of the people and publications that contribute to this important and still-emerging aspect of "modern" interaction with devices.
People: Michel Pahud