{"id":379298,"date":"2017-04-26T14:09:12","date_gmt":"2017-04-26T21:09:12","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&#038;p=379298"},"modified":"2018-10-16T22:03:23","modified_gmt":"2018-10-17T05:03:23","slug":"writlarge-ink-unleashed-unified-scope-action-zoom","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/writlarge-ink-unleashed-unified-scope-action-zoom\/","title":{"rendered":"WritLarge: Ink Unleashed by Unified Scope, Action, & Zoom"},"content":{"rendered":"<p><em>WritLarge<\/em> is a freeform canvas for early-stage design on electronic whiteboards with pen+touch input. The system aims to support a higher-level flow of interaction by \u2018chunking\u2019 the traditionally disjoint steps of <em>selection<\/em> and <em>action<\/em> into unified <em>selection-action phrases<\/em>. This holistic goal led us to address two complementary aspects:<\/p>\n<ul>\n<li>SELECTION, for which we devise a new technique known as the <em>Zoom-Catcher<\/em> that integrates pinch-to-zoom and selection in a single gesture for fluidly selecting and acting on content;<\/li>\n<\/ul>\n<p>plus:<\/p>\n<ul>\n<li>ACTION, where we demonstrate how this addresses the combined issues of navigating, selecting, and manipulating content. In particular, the designer can transform select ink strokes in flexible and easily-reversible representations via <em>semantic<\/em>, <em>structural<\/em>, and <em>temporal<\/em> axes of movement that are defined as conceptual \u2018moves\u2019 relative to the specified content.<\/li>\n<\/ul>\n<p>This approach dovetails zooming with lightweight specification of scope as well as the evocation of context-appropriate commands, at-hand, in a location-independent manner. This establishes powerful new primitives that can help to scaffold higher-level tasks, thereby unleashing the expressive power of ink in a compelling manner.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>WritLarge is a freeform canvas for early-stage design on electronic whiteboards with pen+touch input. The system aims to support a higher-level flow of interaction by \u2018chunking\u2019 the traditionally disjoint steps of selection and action into unified selection-action phrases. 
Published in: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17)
Publisher: ACM
Published: 2017-05-09
Award: Honorable Mention
Research area: Human-Computer Interaction
Mention","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":1,"msr_main_download":"379301","msr_publicationurl":"","msr_doi":"","msr_publication_uploader":[{"type":"file","title":"WritLarge-CHI-2017","viewUrl":"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2017\/04\/WritLarge-CHI-2017.pdf","id":379301,"label_id":0}],"msr_related_uploader":"","msr_citation_count":0,"msr_citation_count_updated":"","msr_s2_paper_id":"","msr_influential_citations":0,"msr_reference_count":0,"msr_arxiv_id":"","msr_s2_author_ids":[],"msr_s2_open_access":false,"msr_s2_pdf_url":null,"msr_attachments":[],"msr-author-ordering":[{"type":"text","value":"Haijun Xia","user_id":0,"rest_url":false},{"type":"user_nicename","value":"kenh","user_id":32521,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=kenh"},{"type":"user_nicename","value":"mpahud","user_id":33007,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=mpahud"},{"type":"text","value":"Xiao Tu","user_id":0,"rest_url":false},{"type":"user_nicename","value":"bibuxton","user_id":31224,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=bibuxton"}],"msr_impact_theme":[],"msr_research_lab":[],"msr_event":[],"msr_group":[371909,379814],"msr_project":[379322,379364],"publication":[],"video":[],"msr-tool":[],"msr_publication_type":"inproceedings","related_content":{"projects":[{"ID":379322,"post_title":"WritLarge","post_name":"writlarge","post_type":"msr-project","post_date":"2017-04-26 16:42:19","post_modified":"2018-07-24 10:23:18","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/writlarge\/","post_excerpt":"WritLarge is a prototype system from Microsoft Research for the 84\" Microsoft Surface Hub, a large electronic whiteboard supporting both pen and multi-touch input. WritLarge allows\u00a0creators to unleash the latent expressive power of ink in a compelling manner. Using multi-touch, the content creator can simply frame a portion of their \u2018whiteboard\u2019 session between thumb and forefinger, and then act on such a selection (such as by copying, sharing, organizing, or otherwise transforming the content) using&hellip;","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/379322"}]}},{"ID":379364,"post_title":"Pen + Touch Interaction","post_name":"pen-touch-interaction","post_type":"msr-project","post_date":"2017-04-26 17:48:02","post_modified":"2018-07-24 10:30:52","post_status":"publish","permalink":"https:\/\/www.microsoft.com\/en-us\/research\/project\/pen-touch-interaction\/","post_excerpt":"Microsoft Research has\u00a0long pioneered new techniques in digital pen input, including particularly\u00a0its combination with multi-touch in a manner that reflects how people naturally use their hands. For example, in the real world people hold (\u201ctouch\u201d) a document with their non-preferred hand, and often frame a particular region of interest between thumb and forefinger. The pen (held, of course, in the preferred hand) then marks up the page. 
  Interfaces with such “Pen + Touch” capabilities are now…