Sharing breathing signals can provide insight into hidden experiences and enhance interpersonal communication. However, it remains unclear how the modality of breath signals (visual, haptic) is socially interpreted during collaborative tasks. In this mixed-methods study, we design and evaluate BreatheWithMe, a prototype for sharing and receiving real-time breathing signals through visual, vibrotactile, or combined visual-vibrotactile modalities. In a within-subjects study (15 pairs), we investigated the effects of modality on breathing synchrony, social presence, and overall user experience. Key findings: (a) visualization modality had no significant effect on breathing synchrony, only on deliberate music-driven synchronization; (b) the visual modality was preferred over vibrotactile feedback, despite no differences across social presence dimensions; (c) BreatheWithMe was perceived as an insightful window into others, but raised data exposure and social acceptability concerns. We contribute insights into the design of multi-modal, real-time breathing visualization systems for colocated, collaborative tasks.
CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
Distributed and Interactive Systems

El Ali, A., Stepanova, E., Palande, S., Mader, A., César Garcia, P. S., & Jansen, K. (2023). BreatheWithMe: Exploring visual and vibrotactile displays for social breath awareness during colocated, collaborative tasks. In CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1–8). ACM. doi:10.1145/3544549.3585589