During group interactions, we react to the group and modulate our emotions and behaviour through phenomena such as emotional contagion and physiological synchrony. Previous work on emotion recognition from video and images has shown that group context information improves classification performance. With physiological data, however, the literature mostly focuses on intrapersonal models that leave out group information, while interpersonal models remain unexplored. This paper introduces a new interpersonal Weighted Group Synchrony approach that relies on Electrodermal Activity (EDA) and Heart-Rate Variability (HRV). We analyse synchrony metrics applied across diverse data representations (EDA and HRV morphology and features, recurrence plots, spectrograms) to identify which metrics and modalities best characterise physiological synchrony for emotion recognition. We explore two datasets (AMIGOS and K-EmoCon), covering different group sizes (4 vs. dyad) and group-based activities (video watching vs. conversation). The experimental results show that integrating group information improves arousal and valence classification on both datasets, with the exception of valence on K-EmoCon. The proposed method attains a mean M-F1 of ≈72.15% for arousal and 81.16% for valence on AMIGOS, and an M-F1 of ≈52.63% for arousal and 65.09% for valence on K-EmoCon, surpassing previous results on K-EmoCon for arousal and providing a new baseline on AMIGOS for long videos.

IEEE Transactions on Affective Computing
Centrum Wiskunde & Informatica (CWI), Amsterdam, The Netherlands

Bota, P., Zhang, T., El Ali, A., Fred, A., Silva, H. P. da, & César Garcia, P. S. (2023). Group synchrony for emotion recognition using physiological signals. IEEE Transactions on Affective Computing, 1–12. doi:10.1109/TAFFC.2023.3265433