
Title

Computational Understanding of the Interaction between Facial Expressions and Electroencephalogram (EEG) Signals for Emotion Recognition

Keywords Affective Computing, Machine Learning

Author Soheil RAYATDOOST
Director of thesis Dr. David Rudrauf
Co-director of thesis Prof. Bastien Chopard, Dr. Mohammad Soleymani
Summary of thesis

Existing research demonstrates the potential of EEG-based emotion recognition. Consistent changes in EEG signals during emotional episodes can be attributed to affective activity in the brain, sensory activity related to the stimuli, and interference from behavioral activity in the facial muscles and eyes. This work aims to better understand the extent to which EEG-based emotion recognition relies on brain activity. We collected a multimodal database including both spontaneous and posed expressions. The correlations between the modalities show that both behavioral and neural responses are present in the EEG signals.
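
A minimal sketch of this kind of cross-modal correlation analysis (Python with NumPy/SciPy; the feature names, shapes, and random data are illustrative assumptions, not the thesis's actual pipeline):

import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-trial features: gamma-band power at one EEG channel
# and the intensity of one facial action unit, each averaged per trial.
rng = np.random.default_rng(0)
eeg_gamma_power = rng.random(200)
facial_au_intensity = rng.random(200)

# A significant correlation between an EEG feature and a facial feature
# suggests the EEG feature carries behavioral (muscle) activity in
# addition to any neural response.
r, p = pearsonr(eeg_gamma_power, facial_au_intensity)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")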

We show that a classifier trained on spontaneous emotional responses can recognize posed expressions, even for participants who reported feeling no emotion while posing. To disentangle these sources, we designed a neural network with a gradient reversal layer that trains an encoder to be invariant to behavioral activity.
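
A minimal sketch of a gradient reversal layer and an adversarially trained encoder (Python with PyTorch; the layer sizes, class names, and two-head architecture are illustrative assumptions, not the thesis's exact model):

import torch
import torch.nn as nn

class GradientReversal(torch.autograd.Function):
    # Identity in the forward pass; multiplies the gradient by -lambda in
    # the backward pass, so the encoder is pushed away from features that
    # help the behavior classifier.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class AdversarialEncoder(nn.Module):
    def __init__(self, n_features, n_emotions, n_behaviors, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.emotion_head = nn.Linear(64, n_emotions)    # main task
        self.behavior_head = nn.Linear(64, n_behaviors)  # adversarial task

    def forward(self, x):
        z = self.encoder(x)
        emotion_logits = self.emotion_head(z)
        # The behavior head learns to predict behavioral activity, but the
        # reversed gradient updates the encoder in the opposite direction,
        # driving the representation toward behavior invariance.
        behavior_logits = self.behavior_head(GradientReversal.apply(z, self.lambd))
        return emotion_logits, behavior_logits

Training would minimize the sum of the emotion and behavior classification losses; the reversal layer flips the sign of the behavior loss gradient for the encoder only.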

The learned representations showed that, after unlearning behavior-related information, emotion recognition performance dropped to near chance level. The sensory responses to the stimuli also contain significant emotion-specific activity. Our analyses demonstrate that behavioral and sensory activities are likely the dominant features in EEG-based emotion recognition. These findings call into question the contribution of affective brain activity to EEG-based emotion recognition and warrant further analysis and discussion in this direction.

Status finishing
Administrative delay for the defence 2020