Inferring emotions from Head Movement (HM) and Eye Movement (EM) data in 360° Virtual Reality (VR) can enable a low-cost means of improving users' Quality of Experience. Prior work has shown correlations between retrospective emotion reports and HM, as well as EM, when tested with static 360° images. In this early work, we investigate the relationship between momentary emotion self-reports and HM/EM during HMD-based 360° VR video watching. We draw on HM/EM data from a controlled study (N=32) in which participants watched eight 1-minute 360° emotion-inducing video clips and continuously annotated their valence and arousal levels in real time. We analyzed HM/EM features across fine-grained emotion labels from video segments of varying lengths (5–60 s), and found significant correlations of HM rotation data, as well as some EM features, with valence and arousal ratings. We show that fine-grained emotion labels provide greater insight into how HM/EM relate to emotions during HMD-based 360° VR video watching.
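To make the segment-level analysis concrete, the sketch below illustrates one way such a correlation could be computed: head rotation speed and continuous valence ratings are averaged within fixed-length segments, and the segment-level values are then correlated. This is a minimal illustration only, not the authors' pipeline; the sampling rate, the yaw-speed feature, the 5-second segment length, and all data are hypothetical placeholders.

```python
# Minimal sketch of a segment-level correlation between an HM feature
# and continuous valence ratings. All signals here are synthetic; the
# feature choice, sampling rate, and segment length are assumptions.
import numpy as np
from scipy.stats import pearsonr

FS = 10            # assumed sampling rate of HM and annotation streams (Hz)
CLIP_LEN_S = 60    # 1-minute clips, as in the study
SEGMENT_LEN_S = 5  # one of the segment lengths explored (5-60 s)

rng = np.random.default_rng(0)

# Hypothetical per-sample signals for one participant and one clip:
# yaw angular speed (deg/s) from the HMD, and a continuous valence rating.
yaw_speed = np.abs(rng.normal(20, 8, FS * CLIP_LEN_S))
valence = np.clip(rng.normal(0.2, 0.5, FS * CLIP_LEN_S), -1, 1)

# Aggregate each stream into one feature per segment (here: segment means).
seg = FS * SEGMENT_LEN_S
n_segments = (FS * CLIP_LEN_S) // seg
yaw_per_seg = yaw_speed[: n_segments * seg].reshape(n_segments, seg).mean(axis=1)
val_per_seg = valence[: n_segments * seg].reshape(n_segments, seg).mean(axis=1)

# Correlate the segment-level HM feature with the segment-level ratings.
r, p = pearsonr(yaw_per_seg, val_per_seg)
print(f"{SEGMENT_LEN_S}s segments: r = {r:.2f}, p = {p:.3f}")
```

In a study of this kind, such per-segment features would presumably be computed per participant and clip, then pooled before statistical testing; varying `SEGMENT_LEN_S` reproduces the 5–60 s granularity sweep the abstract describes.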

https://doi.org/10.1145/3411763.3451627
CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
Centrum Wiskunde & Informatica (CWI), Amsterdam, The Netherlands

Xue, T., El Ali, A., Ding, G., & César Garcia, P. S. (2021). Investigating the relationship between momentary emotion self-reports and Head and Eye Movements in HMD-based 360° VR video watching. In CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–8). https://doi.org/10.1145/3411763.3451627