The Affective Computing and Multimodal Interaction group studies forms of human-computer interaction beyond the classical screen-based visual mode, including haptic and auditory modalities as well as interaction based on physiological signals.
User emotional assessment
Assessment of users' emotional states using multimodal physiological signals
Movie affective characterization using physiological signals and content analysis
Emotion Awareness Tools for Mediated Interaction (EATMINT)
Interaction for visually impaired and blind people
Modality conversion and visual data sonification
A smart walker for senior citizens
Interaction between the user and the computer using only electrical brain activity, i.e. without involving the motor system, unlike a keyboard or mouse (brain-computer interfaces).
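As a hypothetical minimal sketch of such a brain-computer interface, EEG band power can be decoded into a discrete command; the function names, frequency bands, and threshold below are illustrative assumptions, not the group's actual pipeline:

```python
import numpy as np

def bandpower(signal, fs, band):
    """Total spectral power of `signal` within `band` (Hz), via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum()

def decode_command(eeg, fs=256.0):
    """Illustrative decoder: if the alpha band (8-12 Hz) dominates the
    1-40 Hz range (e.g. relaxed, eyes closed), emit 'select', else 'idle'."""
    alpha = bandpower(eeg, fs, (8.0, 12.0))
    total = bandpower(eeg, fs, (1.0, 40.0))
    return "select" if alpha / total > 0.5 else "idle"

# Synthetic demo: a pure 10 Hz oscillation concentrates power in the alpha band.
fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
alpha_wave = np.sin(2 * np.pi * 10 * t)
print(decode_command(alpha_wave, fs))  # prints "select"
```

A real system would use multiple electrodes, artifact removal, and a trained classifier rather than a fixed threshold; this sketch only shows the signal-to-command mapping in miniature.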
3D brain activity reconstruction
Stochastic approaches for determining active brain areas, based on solutions to the forward and inverse problems.
Prof. Thierry Pun (Professor)
Dr. Guido Bologna (Associate Researcher)
Dr. Guillaume Chanel (Senior Researcher and Teaching Assistant)
Dr. Mohammad Soleymani (Senior Researcher)
Dr. Theodoros Kostoulas (Postdoctoral Researcher and Lecturer)
Dr. Anna Aljanaki (Postdoctoral Researcher)
Dr. Phil Lopes (Postdoctoral Researcher)
Sunny Avry (Research Assistant)
Michal Muszynski (Research Assistant)
Soheil Rayatdoost (Research Assistant)
Chen Wang (Research Assistant)
Dr. Patrick Roth (Associate Researcher)
The list of MMI team publications