{"id":577425,"date":"2019-04-17T11:09:17","date_gmt":"2019-04-17T18:09:17","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=577425"},"modified":"2019-04-29T12:46:25","modified_gmt":"2019-04-29T19:46:25","slug":"prototype-tablet-tricked-out-with-sensors-just-proves-mom-was-always-right-posture-is-important","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/prototype-tablet-tricked-out-with-sensors-just-proves-mom-was-always-right-posture-is-important\/","title":{"rendered":"Prototype tablet tricked out with sensors just proves Mom was always right: Posture is important!"},"content":{"rendered":"<p><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/kenhinckley.files.wordpress.com\/2019\/04\/posture-aware-paper_site_03_20_19_1400x788.gif\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-578191 size-large\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-1024x576.png\" alt=\"\" width=\"1024\" height=\"576\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-1024x576.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-768x432.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-655x368.png 655w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-343x193.png 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage.png 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/p>\n<p>The mobility of tablets affords interaction from a wide diversity of <em>postures:<\/em><\/p>\n<p>Hunched over a desk with brow furrowed in concentration. On the go with the tablet gripped in one hand, while operating it with the other. Or kicked back on a couch to relax with some good old-fashioned <em>Cat vs. Laser Pointer<\/em> internet-video action.<\/p>\n<p>This versatility of situation, task, and mood is a big part of what makes tablets so appealing.<\/p>\n<p>But as the scenarios above illustrate, there\u2019s another crucial point: the nature of the activity changes as we move between these contexts.<\/p>\n<p>And these changes are mirrored by the user\u2019s physical posture\u2014how they sit, how closely they engage with the device, which hand holds and which hand manipulates, all the way down to the fine-grained details of how the user grasps the tablet\u2019s bezel between thumb and forefinger.<\/p>\n<p>That\u2019s where the <em>Posture-Aware Interface<\/em> comes in. 
Because applications can transform in compelling ways by directly sensing how people hold and manipulate their tablets\u2014and interact with them using multi-touch and digital pen inputs.<\/p>\n<div id=\"attachment_577431\" style=\"width: 1034px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-1-posture-aware-interface.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-577431\" class=\"wp-image-577431 size-large\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-1-posture-aware-interface-1024x836.png\" alt=\"\" width=\"1024\" height=\"836\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-1-posture-aware-interface-1024x836.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-1-posture-aware-interface-300x245.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-1-posture-aware-interface-768x627.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-1-posture-aware-interface.png 1431w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><p id=\"caption-attachment-577431\" class=\"wp-caption-text\">Figure 1 \u2013 The Posture-Aware Interface adapts to many nuances of how a person holds, touches, or marks up content on their tablet.<\/p><\/div>\n<p>Figure 1 above shows a simple application for annotation and mark-up that responds to various contexts sensed by our system:<\/p>\n<ul>\n<li>Grasping the tablet summons Thumb Tools to a grip-centric location nearby, such as at the left (Panel A) or bottom (Panel B) edges of the screen.<\/li>\n<li>Putting the tablet flat on a desk reverts to a more standard type of toolbar, at a device-centric position near the upper right (Panel C).<\/li>\n<li>Planting the preferred hand on the screen to write 
automatically orients miniature Palm Tools (Panel D) to a convenient hand-centric location.<\/li>\n<li>Or laying the pen down on the screen directly accesses its settings for customization (Panel E).<\/li>\n<li>Finally, if you set the fingers of your left hand on the screen, Fan Tools appear with more options\u2014but note how they\u2019re splayed out to the right, for easy access by the pen (Panel F).<\/li>\n<li>But if you take the same action with the right hand, the system directly senses this and splays them out to the left instead (Panel G).<\/li>\n<\/ul>\n<p>These are all simple adaptations driven by sensors that detect how the user postures, grips, or reaches for the device. And the interface dynamically morphs in response, with smooth animations that make everything feel fluid and responsive.<\/p>\n<p>Indeed, by being acutely aware of these fine details of context, the Posture-Aware Interface comes to grips with our half-formed thoughts. This is not mind reading, but when the device anticipates your every move, it can certainly feel that way at times.<\/p>\n<p>But it works because how people hold a device, and reach for the screen, reveals their tacit intentions\u2014much in the way that a poker player might give away an all-too-revealing tell by how convincingly they lay down their bet.<\/p>\n<h3>Realizing posture-aware sensing (a.k.a., getting our hands dirty with some cool hardware)<\/h3>\n<p>All right then, since it\u2019s not powered by card sharks, how does the Posture-Aware Interface achieve this acuteness of observation?<\/p>\n<p>The system combines three distinct sensing modalities:<\/p>\n<ol>\n<li>the full image sensed by the touchscreen<\/li>\n<li>the tilt and motion of the device<\/li>\n<li>the capacitive field surrounding the device itself<\/li>\n<\/ol>\n<p>The first element uses the image of your hand resting on the touchscreen to reason about what\u2019s going on. 
In principle this is just like a preschooler exuberantly mashing their paint-laden fingers onto (digital) construction paper. In practice, modern touchscreen interfaces derive entirely from individual finger-contact events\u2014and in the process lose much of the joy and expressiveness of painting with our entire hands in childhood. But by going back to the drawing board (so to speak), our approach brings aspects like hand contact\u2014simply being able to detect your palm resting on the screen while writing\u2014back into the vocabulary of touch-screen interaction.<\/p>\n<p>The second element uses the accelerometer and gyroscope sensors built into modern devices to understand the angle of the screen, and how it\u2019s moving (or not moving). The established use for such sensors is automatic screen rotation\u2014which just happens to be another innovation introduced by Microsoft Research, some 20 years ago. But in the present project we use the sensors to reason about stationary versus mobile use of tablets, allowing graceful transitions between many different physical postures of engagement with a device. And indeed, we can now use the added context of grip to suppress auto-rotation when you lie down on a couch.<\/p>\n<p>The third element is perhaps where the real magic is. We built a custom electrode ring\u2014essentially, a series of about 50 copper tape segments under the bezel of the screen\u2014that can detect the capacitance of an approaching hand.<\/p>\n<p>That\u2019s right, the Posture-Aware Interface can sense and respond to your hand even before it touches down on the screen.<\/p>\n<p>It can also separately detect when your hand actually grasps the screen bezel, so that we can tell exactly where you\u2019re gripping it.<\/p>\n<p>Or it can combine the two modes, such as when you\u2019re gripping it with one hand, and touching the screen with the other. 
In this case, the sensor can further detect which direction your arm is reaching from because your forearm passes over the electrode ring.<\/p>\n<p>More formally, this super-sensory get-up is known as a Peripheral Electric Field Sensor, if you really want to impress guests at your next dinner party with some arcane capacitance-sensing terminology. And it\u2019s driven by the custom circuitry illustrated below.<\/p>\n<div id=\"attachment_577437\" style=\"width: 1034px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-2-posture-aware-interface.png.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-577437\" class=\"wp-image-577437 size-large\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-2-posture-aware-interface.png-1024x561.jpg\" alt=\"\" width=\"1024\" height=\"561\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-2-posture-aware-interface.png-1024x561.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-2-posture-aware-interface.png-300x164.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-2-posture-aware-interface.png-768x421.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/figure-2-posture-aware-interface.png.jpg 1429w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><p id=\"caption-attachment-577437\" class=\"wp-caption-text\">Figure 2 \u2013 The Peripheral Electric Field Sensor consists of an electrode ring and custom circuitry that can detect when your hands approach or grip the tablet.<\/p><\/div>\n<p>All of this may sound quite fancy, but indeed one of the reasons we chose these three sensor elements was that we believe they are all amenable to practical integration with consumer tablets. 
In fact, the first two elements are already widely available\u2014the innovation here is simply to combine these to gain awareness of the posture of the device relative to the user, and then to leverage these insights to drive contextually appropriate adaptations.<\/p>\n<h3>Come and experience it for yourself<\/h3>\n<p>The Posture-Aware Interface will be presented in full scientific depth at the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/chi2019.acm.org\/\">ACM CHI 2019 Conference on Human Factors in Computing Systems in Glasgow<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, where it received an Honorable Mention for Best Paper award. Check out \u201c<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/sensing-posture-aware-pentouch-interaction-on-tablets\/\">Sensing Posture-Aware Pen + Touch Interaction on Tablets<\/a>&#8221; for further details.<\/p>\n<p>The research was conducted by the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/aka.ms\/epic-research\">EPIC (Extended Perception, Interaction, and Cognition) group at Microsoft Research<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> in Redmond, Washington, a team that innovates at the nexus of hardware, software, and human potential.<\/p>\n<p>Research Intern Yang Zhang, who hails from Carnegie Mellon University, led the project (and exhibited unparalleled hardware wizardry!) during his time at Microsoft Research. 
Other contributors include Michel Pahud, Christian Holz, Haijun Xia, Gierad Laput, Michael McGuffin, Xiao Tu, Andrew Mittereder, Fei Su, William Buxton, and Ken Hinckley.<\/p>\n<h3>Posing thoughts<\/h3>\n<p>Overall, our work on the Posture-Aware Interface demonstrates how posture awareness can adapt interaction and morph user interface elements to suit the fine-grained context of use for pen and touch interaction on tablets.<\/p>\n<p>Posture awareness includes the ability to sense grip, the angle of the tablet, the presence and orientation of the palm on the screen while writing or sketching, and the direction of reach during touch. And our work shows how just a few simple sensors can achieve this\u2014enabling tablets to more effectively support both mobile and stationary use, and the many postural nuances in-between.<\/p>\n<p>In the meantime, whether you\u2019re setting ergonomic trends, sitting at your workstation tall and straight enough to make a drill sergeant proud, or in a repose of slothful decadence upon your favorite <em>chaise longue<\/em>, we\u2019re envisioning a world in which your interface accommodates you\u2014and not the other way around.<\/p>\n<p>So strike a pose and read our paper; we\u2019d love to hear what you think, where you stand, and how the idea of a posture-aware interface sits with you.<\/p>\n<p>However you come at the topic, make your momma proud\u2014and remember not to slouch!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The mobility of tablets affords interaction from a wide diversity of postures: Hunched over a desk with brow furrowed in concentration. On the go with the tablet gripped in one hand, while operating it with the other. Or kicked back on a couch to relax with some good old-fashioned Cat vs. Laser Pointer internet-video action. 
[&hellip;]<\/p>\n","protected":false},"author":38022,"featured_media":578191,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":[{"type":"user_nicename","value":"Ken Hinckley","user_id":"32521"}],"msr_hide_image_in_river":0,"footnotes":""},"categories":[194481],"tags":[],"research-area":[13554],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-577425","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-human-centered-computing","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[379814],"related-projects":[],"related-events":[577950],"related-researchers":[],"msr_type":"Post","featured_image_thumbnail":"<img width=\"960\" height=\"540\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage.png\" class=\"img-object-cover\" alt=\"\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage.png 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-300x169.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-768x432.png 768w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-1024x576.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-1066x600.png 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-655x368.png 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2019\/04\/Posture-Aware-Paper_Social_03_2019_1400x788_StillImage-343x193.png 343w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"Ken Hinckley","formattedDate":"April 17, 2019","formattedExcerpt":"The mobility of tablets affords interaction from a wide diversity of postures: Hunched over a desk with brow furrowed in concentration. On the go with the tablet gripped in one hand, while operating it with the other. Or kicked back on a couch to relax&hellip;","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/577425","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/users\/38022"}],"replies":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/comments?post=577425"}],"version-history":[{"count":10,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/577425\/revisions"}],"predecessor-version":[{"id":578206,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/posts\/577425\/revisions\/578206"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/r
esearch\/wp-json\/wp\/v2\/media\/578191"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=577425"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/categories?post=577425"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/tags?post=577425"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=577425"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=577425"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=577425"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=577425"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=577425"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=577425"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=577425"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=577425"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}