Animating the Dead: Computational Necromancy with Reinforcement Learning
Anthropologists are often interested in questions such as "What do chimpanzees optimize for when they walk, and when they climb?", "What would Lucy (Australopithecus afarensis) have looked like when she walked?", and "How did the modern human musculoskeletal system evolve from that of our evolutionary ancestors?". All of these problems can be cast as learning a controller for a complex biomechanical system that optimizes some criterion. Reinforcement learning offers a set of techniques to address this problem formulation, provided we can overcome the difficulties of learning in a high-dimensional, non-linear system that is continuous in states, actions, and time.
In this talk, I will outline the types of biomechanics problems that anthropologists are interested in, and show how reinforcement learning can be used to address them. In particular, I will describe how Differential Dynamic Programming can be used to learn controllers for complex musculoskeletal simulations of humans, chimpanzees, and extinct hominins, and how these learned controllers can be used to answer fundamental questions in physical anthropology.
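To give a flavor of the technique named above: Differential Dynamic Programming iteratively linearizes the dynamics and quadratically approximates the cost around a nominal trajectory, then runs a backward pass to compute feedback gains and a forward pass to roll out the improved controller. The following is a minimal illustrative sketch, not the speaker's actual method: in the linear-quadratic special case, a single DDP backward pass reduces to the discrete-time Riccati recursion, shown here on a toy double-integrator (all dynamics, costs, and variable names are invented for illustration).

```python
import numpy as np

def ddp_backward_pass(A, B, Q, R, Qf, T):
    """Backward pass: for linear dynamics x' = Ax + Bu and quadratic cost,
    DDP's backward sweep is exactly the discrete-time Riccati recursion,
    yielding time-varying feedback gains K_t."""
    P = Qf
    gains = []
    for _ in range(T):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]

def rollout(A, B, gains, x0):
    """Forward pass: apply the feedback policy u_t = -K_t x_t from x0."""
    xs, x = [x0], x0
    for K in gains:
        x = A @ x + B @ (-K @ x)
        xs.append(x)
    return np.array(xs)

# Toy "point mass" double integrator: state = [position, velocity], control = force.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = 0.1 * np.eye(2)     # running state cost
R = 0.01 * np.eye(1)    # control-effort cost
Qf = 100.0 * np.eye(2)  # terminal cost: end at rest near the origin
gains = ddp_backward_pass(A, B, Q, R, Qf, T=100)
traj = rollout(A, B, gains, x0=np.array([1.0, 0.0]))
print(traj[-1])  # final [position, velocity], driven close to zero
```

For the nonlinear musculoskeletal systems discussed in the talk, the same backward/forward structure is applied repeatedly, re-linearizing around the current trajectory on each iteration until the controller converges.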
Bill Smart received his Ph.D. from Brown University where he worked on applying machine learning techniques to real robots. He is currently an assistant professor of computer science at Washington University in St. Louis. His research interests lie in the areas of machine learning, human-robot interaction, and brain-computer interfaces.
- Bill Smart
- Dept of CS & Engineering, Washington University in St. Louis