Max Planck Institute for Empirical Aesthetics
Lecture by Dominik Endres: Models of human movement
perception and production
Virtual reality (VR) is a growing digital medium with many potential applications in psychological research and practice. Two necessary ingredients for an immersive VR experience are perceptually convincing appearance models for the objects and characters that populate the virtual environment, and equally convincing generative models of their behaviour. While the former have reached a level of maturity that allows their deployment in VR even by non-expert users, generating naturalistic behaviour still requires a large amount of hand-crafting and skill. In this talk, I will present our research on using probabilistic machine learning approaches to learn and control human movement models. Our approaches are inspired by two major theories of human movement production: optimal control and movement primitives. I will show that these models are able to predict, and sometimes surpass, human perceptual expectations about movement naturalness. Furthermore, these models can also stably control humanoid bodies, i.e. they are simultaneously models of production and perception, making them likely candidates for joint action-perception models in the spirit of the ‘common coding’ hypothesis. This property facilitates their use for the online reactive control of virtual-reality avatars. Our future work will focus on adding a semantic component to our models, to make controlling human avatars through natural-language instructions feasible.
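To give a concrete flavour of the movement-primitive idea mentioned above, the sketch below implements a textbook one-dimensional dynamic movement primitive (in the style of Ijspeert et al.): a damped spring system that pulls toward a goal, modulated by a learned forcing term built from Gaussian basis functions. This is a generic illustration, not the specific models presented in the talk; all parameter values and the function name are illustrative assumptions.

```python
import numpy as np

def dmp_rollout(y0, goal, weights, tau=1.0, dt=0.01,
                alpha=25.0, beta=6.25, alpha_s=4.0):
    """Integrate a 1-D dynamic movement primitive (illustrative sketch).

    y0      : start position
    goal    : target position
    weights : weights of the Gaussian basis functions shaping the trajectory
    tau     : temporal scaling of the movement
    """
    n_basis = len(weights)
    # Basis-function centres spread along the canonical phase s in (0, 1]
    centres = np.exp(-alpha_s * np.linspace(0.0, 1.0, n_basis))
    widths = n_basis ** 1.5 / centres  # narrower bases late in the movement

    y, dy, s = float(y0), 0.0, 1.0
    traj = [y]
    for _ in range(int(1.0 / dt)):
        # Forcing term: phase-dependent mixture of Gaussians, gated by s
        # so its influence vanishes as the movement completes
        psi = np.exp(-widths * (s - centres) ** 2)
        f = s * (goal - y0) * (psi @ weights) / (psi.sum() + 1e-10)
        # Critically damped spring toward the goal, perturbed by f
        ddy = (alpha * (beta * (goal - y) - dy) + f) / tau
        dy += ddy * dt
        y += dy * dt
        s += -alpha_s * s * dt / tau  # canonical system: phase decays
        traj.append(y)
    return np.array(traj)

# With zero weights the primitive reduces to a plain point attractor
# that converges smoothly from y0 to the goal:
traj = dmp_rollout(y0=0.0, goal=1.0, weights=np.zeros(10))
```

The appeal of this representation for perception and production alike is that a whole family of naturalistic trajectories is parameterised by a small weight vector, which can be learned from motion-capture data and then modulated online (e.g. by changing the goal), rather than hand-crafted frame by frame.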