Articulated Human Pose Tracking with Inertial Sensors

Date

August 24, 2017

Speaker

Xuesu Xiao 

Affiliation

Texas A&M University

Overview

Thus far, capturing human body motion has only been possible with precisely positioned sensors and cameras in well-calibrated studio environments. This constraint has fundamentally restricted motion-capture technology to low- to no-mobility scenarios. To track motion precisely, the state of the art is to use either well-calibrated inertial or magnetic-position sensors tightly strapped to the body, or high-speed cameras confined to a finite capture volume. In recent work, we aim to break this barrier and make motion capture accessible outside of traditional studio environments, enabling new applications in VR/AR, healthcare, and sports training. In this talk, we will describe a new system with sensors integrated into everyday garments that can be used for articulated motion tracking in unconstrained and mobile settings. We will show how to quantify the displacement between sensors and body segments within a multi-rooted kinematic chain, and how to utilize deep-learning techniques for full-pose reconstruction. Our results are based on over 3 hours of data collected via 215 trials on 12 test subjects in a custom studio room set up for this purpose.
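To give a flavor of the kinematic-chain reconstruction mentioned above, the sketch below shows textbook forward kinematics along a single chain: each segment's global orientation (as an IMU might report it) rotates a canonical bone vector, and joint positions accumulate from the root. All function and variable names are hypothetical, and this is only an illustrative simplification, not the system described in the talk, which additionally models sensor-to-segment displacement and uses learned models.

```python
import numpy as np

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q = (w, x, y, z).
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def chain_positions(orientations, bone_lengths, root=np.zeros(3)):
    """Forward kinematics along one chain (e.g., shoulder -> elbow -> wrist).

    orientations: per-segment global unit quaternions (w, x, y, z)
    bone_lengths: per-segment lengths; each bone points along +z in its
                  own segment frame (an assumed convention).
    Returns an array of joint positions, starting at the root.
    """
    positions = [np.asarray(root, dtype=float)]
    for q, length in zip(orientations, bone_lengths):
        bone = quat_rotate(q, np.array([0.0, 0.0, length]))
        positions.append(positions[-1] + bone)
    return np.array(positions)
```

With identity orientations the chain extends straight along +z; rotating the first segment bends everything downstream of it, which is exactly why per-segment orientation errors compound along the chain and sensor-to-segment calibration matters.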