To recognize emotions using less obtrusive wearable sensors, we present a novel emotion recognition method that uses only pupil diameter (PD) and skin conductance (SC). Psychological studies show that these two signals are related to the attention level of humans exposed to visual stimuli. Based on this, we propose a feature extraction algorithm that extracts correlation-based features across participants watching the same video clip. To boost performance given limited data, we implement a learning system without a deep architecture to classify arousal and valence. Our method outperforms not only state-of-the-art approaches, but also widely used traditional and deep learning methods.
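The abstract does not detail the exact CorrFeat features or classifier, so the following is only a minimal sketch of the general idea: for each participant, correlate their PD and SC traces with those of the other participants who watched the same clip, then train a shallow (non-deep) classifier on these features. The `correlation_features` function, the specific correlations used, the synthetic data, and the SVM classifier are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's actual CorrFeat algorithm):
# correlation-based features from pupil diameter (PD) and skin conductance (SC)
# of participants watching the same video clip, fed to a shallow classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def correlation_features(pd_signals, sc_signals):
    """Hypothetical correlation features per participant.

    pd_signals, sc_signals: arrays of shape (n_participants, n_samples),
    aligned to the same video clip. Returns (n_participants, 3) features:
    [corr(PD_i, mean PD of others), corr(SC_i, mean SC of others), corr(PD_i, SC_i)].
    """
    n = pd_signals.shape[0]
    feats = np.zeros((n, 3))
    for i in range(n):
        others = [j for j in range(n) if j != i]
        pd_ref = pd_signals[others].mean(axis=0)   # group-mean PD of other viewers
        sc_ref = sc_signals[others].mean(axis=0)   # group-mean SC of other viewers
        feats[i, 0] = np.corrcoef(pd_signals[i], pd_ref)[0, 1]
        feats[i, 1] = np.corrcoef(sc_signals[i], sc_ref)[0, 1]
        feats[i, 2] = np.corrcoef(pd_signals[i], sc_signals[i])[0, 1]
    return feats

# Synthetic example: 20 participants, 1000-sample traces, balanced binary labels
# (e.g., low/high arousal). Real data would come from wearable PD/SC recordings.
rng = np.random.default_rng(0)
pd_sig = rng.standard_normal((20, 1000))
sc_sig = rng.standard_normal((20, 1000))
labels = np.repeat([0, 1], 10)

X = correlation_features(pd_sig, sc_sig)
clf = SVC(kernel="rbf")                      # shallow, non-deep classifier
print(cross_val_score(clf, X, labels, cv=5).mean())
```

A shallow classifier over a handful of correlation features keeps the model's capacity small, which matches the paper's motivation of working with limited training data.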

Xinhua Network
https://doi.org/10.1145/3340555.3353716
International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction
Distributed and Interactive Systems

Zhang, T., El Ali, A., Wang, C., Zhu, X., & César Garcia, P. S. (2019). CorrFeat: Correlation-based feature extraction algorithm using skin conductance and pupil diameter for emotion recognition. In ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction (pp. 404–408). doi:10.1145/3340555.3353716