S³CIX 2026

Track B

Bayesian Modeling of Multisensory Integration in Interactive Movement

Joanna Emilia Bergström

Wed 11:00, Live in SAWB 423, 120 min
Wed 14:00, Live in SAWB 423, 30 min

Abstract: Interactive movement tasks, such as reaching, pointing, and steering, depend on how users integrate sensory feedback (often visual and proprioceptive, sometimes also haptic or auditory) under uncertainty. This course introduces Bayesian models of multisensory integration to formalise how sensory signals are weighted over time in different interactive movement tasks. Building on these models, participants will explore a simple computational agent that simulates how sensory feedback contributes to updating the inferred position and velocity (e.g., of a hand or a cursor). We then explore how interaction techniques implicitly manipulate sensory feedback, and what consequences these manipulations may have for interactive movement. Through hands-on examples and exploration, we demonstrate how such models can be used to test hypotheses and predict performance in HCI tasks.
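The core idea behind the Bayesian models discussed in the abstract can be sketched in a few lines. The following is a minimal illustration (not the course's own code; function names and the example numbers are illustrative) of the standard precision-weighted fusion of two noisy sensory estimates, here a visual and a proprioceptive estimate of hand position:

```python
def fuse_cues(mu_vis, var_vis, mu_prop, var_prop):
    """Bayesian integration of two independent Gaussian cues.

    Each cue is weighted by its precision (inverse variance), so the
    more reliable cue dominates the fused estimate. The fused variance
    is lower than either cue's variance alone.
    """
    precision_vis = 1.0 / var_vis
    precision_prop = 1.0 / var_prop
    w_vis = precision_vis / (precision_vis + precision_prop)

    mu_fused = w_vis * mu_vis + (1.0 - w_vis) * mu_prop
    var_fused = 1.0 / (precision_vis + precision_prop)
    return mu_fused, var_fused


# Illustrative example: vision is four times as reliable as
# proprioception, so the fused estimate lies close to the visual cue.
mu, var = fuse_cues(mu_vis=0.0, var_vis=1.0, mu_prop=2.0, var_prop=4.0)
print(mu, var)  # fused mean 0.4, fused variance 0.8
```

An interaction technique that degrades one channel (e.g., blurring or offsetting visual feedback of a cursor) effectively raises that cue's variance, shifting the weights toward the other senses; the same update rule, applied repeatedly over time, underlies the kind of position-and-velocity-tracking agent the abstract describes.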
