We develop the CEAP-360VR dataset to address the lack of continuously annotated behavioral and physiological datasets for affective computing with 360° VR videos. Accordingly, the dataset contains: (a) questionnaire responses (SSQ, IPQ, NASA-TLX); (b) continuous valence-arousal annotations; (c) head and eye movements as well as left and right pupil diameters recorded while watching the videos; and (d) peripheral physiological signals (ACC, EDA, SKT, BVP, HR, IBI). The dataset also includes data pre-processing and validation scripts, along with a dataset description and documentation of the key steps in data acquisition and pre-processing.
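As a minimal sketch of how the continuous valence-arousal annotations might be loaded and sanity-checked, the snippet below assumes a per-participant CSV file with `timestamp`, `video_id`, `valence`, and `arousal` columns; the file name, directory layout, and column names are illustrative assumptions, not the dataset's documented schema.

```python
# Sketch: load and inspect continuous valence-arousal annotations.
# Paths and column names below are hypothetical, not the official layout.
from pathlib import Path

import pandas as pd

# Hypothetical path to one participant's continuous annotation file.
annotation_file = Path("CEAP-360VR/ContinuousAnnotation/P01_annotation.csv")

df = pd.read_csv(annotation_file)

# Basic checks a validation script might perform (assumed value range [-1, 1]).
assert {"timestamp", "video_id", "valence", "arousal"}.issubset(df.columns)
assert df["valence"].between(-1, 1).all()
assert df["arousal"].between(-1, 1).all()

# Quick per-video summary of mean valence and arousal.
summary = df.groupby("video_id")[["valence", "arousal"]].mean()
print(summary)
```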

doi.org/10.5281/zenodo.6143643
creativecommons.org/licenses/by/4.0/legalcode
Distributed and Interactive Systems

Xue, T., El Ali, A., Zhang, T., Ding, G., & César Garcia, P. S. (2021). CEAP-360VR: A continuous physiological and behavioral emotion annotation dataset for 360 VR videos. doi:10.5281/zenodo.6143643