Last updated: 19 July 2024
We have a 2D projector-based setup and an Oculus Rift S HMD-based setup, along with Ultraleap Leap Motion sensors for hand tracking and a Kinect sensor for face tracking. The lab is currently running experiments on action and agency, on information integration across multiple modalities during action planning and execution, and on dynamic facial expression recognition.