Enhancing music events using physiological sensor data
This demo showcases a real-time visualisation that displays the engagement level of a group of people attending a jazz concert. Building on wearable sensor technology and machine learning principles, we describe how this visualisation for enhancing events was developed following a user-centric approach: running an experiment with our custom physiological sensor platform, gathering requirements for the visualisation, and finally implementing it. The end result is a collaborative artwork that enhances people's immersion in cultural events.
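The abstract describes deriving a group engagement level from physiological (GSR) sensor readings for a live visualisation. The paper itself includes no code; the sketch below is a minimal, illustrative assumption of how such a signal might be computed: normalise each attendee's GSR stream, average across the group per time step, and smooth with a moving average. Function names and the normalise-average-smooth pipeline are assumptions, not the authors' actual method.

```python
from statistics import mean


def normalize(samples):
    """Scale one attendee's GSR samples to [0, 1] (assumed preprocessing step)."""
    lo, hi = min(samples), max(samples)
    if hi == lo:
        return [0.0] * len(samples)
    return [(s - lo) / (hi - lo) for s in samples]


def group_engagement(streams, window=3):
    """Illustrative group signal: average normalized streams, then smooth.

    streams: list of equal-length per-attendee GSR sample lists.
    window:  moving-average width used for smoothing.
    """
    norm = [normalize(s) for s in streams]
    # Average across attendees at each time step.
    averaged = [mean(vals) for vals in zip(*norm)]
    # Trailing moving average to damp momentary spikes.
    smoothed = []
    for i in range(len(averaged)):
        w = averaged[max(0, i - window + 1): i + 1]
        smoothed.append(mean(w))
    return smoothed
```

For example, two attendees whose signals mirror each other yield a flat group engagement level of 0.5, since the per-attendee normalisation removes individual baselines before averaging.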
Keywords: Cultural experiences, Data visualisation, GSR, Interactive art, Sensors, Shared experiences

Conference: ACM International Conference on Multimedia
Röggla, T., Shirzadian, N., Zheng, Z., Panza, A., & Cesar Garcia, P. S. (2017). Enhancing music events using physiological sensor data. In MM 2017 - Proceedings of the 2017 ACM Multimedia Conference (pp. 1239–1240). doi:10.1145/3123266.3127919