Delayed opening of registration
Registration will not open on February 1st as previously advertised, due to an issue with payment processing.
As soon as this problem has been resolved, the registration link will become available. Please check back soon!
Student applications are open
See Registration for details.
Bayesian Modeling of Multisensory Integration in Interactive Movement
Abstract: Interactive movement tasks, such as reaching, pointing, and steering, depend on how users integrate sensory feedback (often visual and proprioceptive, sometimes also haptic or auditory) under uncertainty. This course introduces Bayesian models of multisensory integration to formalise how sensory signals are weighted over time in different interactive movement tasks. Building on these models, participants will explore a simple computational agent that simulates how sensory feedback contributes to updating inferred position and velocity (e.g., of a hand or a cursor). We then examine how interaction techniques implicitly manipulate sensory feedback, and what consequences this may have for the resulting movements. Through hands-on examples and exploration, we demonstrate how such models can be used to test hypotheses and predict performance in HCI tasks.
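To give a flavour of the kind of model the course builds on, here is a minimal sketch of reliability-weighted Bayesian cue combination, the standard building block for multisensory integration. It assumes two Gaussian cues (visual and proprioceptive) to the same position; the function name integrate_cues and the numeric values are illustrative only and not taken from the course materials.

def integrate_cues(mu_vis, var_vis, mu_prop, var_prop):
    """Maximum-likelihood fusion of two Gaussian position cues.

    Each cue is weighted by its reliability (inverse variance), so the
    less noisy sense dominates the combined estimate.
    """
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_prop)
    mu = w_vis * mu_vis + (1 - w_vis) * mu_prop
    var = 1 / (1 / var_vis + 1 / var_prop)
    return mu, var

# Example: a cursor seen at 10.2 cm (precise vision) while the hand is
# felt at 9.0 cm (noisier proprioception). The fused estimate lands
# closer to the visual cue because vision is more reliable here.
mu, var = integrate_cues(mu_vis=10.2, var_vis=0.25, mu_prop=9.0, var_prop=1.0)
print(f"fused estimate: {mu:.2f} cm (variance {var:.2f})")

In this framing, an interaction technique that degrades one feedback channel (e.g., increasing var_vis by blurring or offsetting the cursor) shifts the weighting toward the other senses, which is exactly the kind of manipulation whose consequences for movement the course explores.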