This demo showcases a real-time visualisation that displays the level of engagement of a group of people attending a jazz concert. Based on wearable sensor technology and machine learning principles, we present how this visualisation for enhancing events was developed following a user-centric approach. We describe the process of running an experiment using our custom physiological sensor platform, gathering requirements for the visualisation, and finally implementing it. The end result is a collaborative artwork that enhances people's immersion in cultural events.
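The paper itself contains no code, but as a rough illustration of the general idea of turning streamed physiological readings into an engagement signal for a real-time visualisation, the following minimal sketch may help. All names, the sensor signal (galvanic skin response), window size, and the averaging step are hypothetical assumptions for illustration and do not reflect the authors' actual platform or model.

```python
# Illustrative sketch only: sensor access, signal choice and the scoring step
# are assumptions, not the implementation described in the demo paper.
import random
import time
from collections import deque


def read_gsr(person_id: int) -> float:
    """Placeholder for a wearable sensor reading (e.g. galvanic skin response)."""
    return random.uniform(0.0, 1.0)


def engagement_level(window: deque) -> float:
    """Map a short window of samples to a 0..1 engagement score.
    A simple moving average stands in here for a trained model."""
    return sum(window) / len(window)


def main() -> None:
    # One rolling window of recent samples per audience member.
    audience = {pid: deque(maxlen=20) for pid in range(5)}
    for _ in range(10):  # a few update cycles
        for pid, window in audience.items():
            window.append(read_gsr(pid))
            print(f"person {pid}: engagement {engagement_level(window):.2f}")
        time.sleep(0.5)  # real-time update interval fed to the visualisation


if __name__ == "__main__":
    main()
```

In a real deployment, the per-person scores printed here would instead be streamed to the visualisation front end and rendered as a collective, evolving artwork.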

doi.org/10.1145/3123266.3127919
ACM International Conference on Multimedia
Centrum Wiskunde & Informatica (CWI), Amsterdam, The Netherlands

Röggla, T., Shirzadian, N., Zheng, Z., Panza, A., & César Garcia, P. S. (2017). Enhancing music events using physiological sensor data. In MM 2017 - Proceedings of the 2017 ACM Multimedia Conference (pp. 1239–1240). doi:10.1145/3123266.3127919