With the increasing availability of head-mounted displays (HMDs) that show immersive 360° VR content, it is important to understand to what extent these immersive experiences can evoke emotions. Typically, to collect emotion ground-truth labels, users rate videos through post-experience self-reports that are discrete in nature. However, post-stimulus self-reports are temporally imprecise, especially after watching 360° videos. In this work, we design six continuous emotion annotation techniques for the Oculus Rift HMD aimed at minimizing workload and distraction. Based on a co-design session with six experts, we contribute HaloLight and DotSize, two continuous annotation methods deemed unobtrusive and easy to understand. We discuss the next challenges in evaluating the usability of these techniques and the reliability of continuous annotations.

Xue, T., Ghosh, S., Ding, G., El Ali, A., & César Garcia, P. S. (2020). Designing real-time, continuous emotion annotation techniques for 360° VR videos. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–9). doi:10.1145/3334480.3382895