There are different modes of interaction with a software keyboard on a smartphone, such as typing and swyping. Patterns of these touch interactions may reflect a user's emotions. Because users may switch between touch modalities while using the keyboard, automatic emotion detection from touch patterns must consider both modalities in combination. In this paper, we focus on identifying features of touch interactions with a smartphone keyboard that lead to a personalized model for inferring user emotion. Since distinguishing typing from swyping is important for recording the correct features, we design a technique to identify the modality reliably. Ground-truth labels for user emotion are obtained directly from users through periodic self-reports. We jointly model typing and swyping features and correlate them with the self-reports to build a personalized machine learning model that detects four emotion states (happy, sad, stressed, relaxed). We combine these design choices into an Android application, TouchSense, and evaluate it in a 3-week in-the-wild study involving 22 participants. Our key evaluation results and post-study participant assessment demonstrate that it is possible to predict these emotion states with an average accuracy (AUCROC) of 73% (std. dev. 6%, maximum 87%) from these two touch interactions alone.
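To make the abstract's pipeline concrete, below is a minimal, hedged sketch of a personalized (per-user) emotion classifier trained on keyboard touch features and self-reported labels, evaluated with a one-vs-rest AUCROC over the four emotion states. The feature columns, the Random Forest classifier, and the hold-out split are illustrative assumptions, not the exact method reported in the paper.

```python
# Sketch of a personalized emotion model from touch-interaction features.
# Assumptions (not from the paper): scikit-learn Random Forest, a simple
# stratified hold-out split, and hypothetical feature column names.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

EMOTIONS = ["happy", "sad", "stressed", "relaxed"]

def train_personalized_models(df: pd.DataFrame) -> dict:
    """Train one model per user; return each user's macro one-vs-rest AUCROC.

    Expected columns (hypothetical): 'user_id', 'emotion' (self-report label),
    plus numeric touch features such as typing speed, inter-key delay,
    swype length, or touch pressure.
    """
    feature_cols = [c for c in df.columns if c not in ("user_id", "emotion")]
    scores = {}
    for user_id, user_df in df.groupby("user_id"):
        X = user_df[feature_cols].to_numpy()
        y = user_df["emotion"].to_numpy()
        # Stratified split keeps all four emotion states in train and test.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, stratify=y, random_state=0
        )
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_train, y_train)
        proba = clf.predict_proba(X_test)
        # Macro-averaged one-vs-rest AUCROC over the four emotion classes.
        scores[user_id] = roc_auc_score(
            y_test, proba, multi_class="ovr", labels=clf.classes_
        )
    return scores
```

Averaging the returned per-user scores would correspond to the kind of mean AUCROC the study reports; the exact features and model family used by TouchSense are described in the paper itself.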


Ghosh, S., Hiware, K., Ganguly, N., Mitra, B., & De, P. (2019). Emotion detection from touch interactions during text entry on smartphones. International Journal of Human-Computer Studies, 130, 47–57. doi:10.1016/j.ijhcs.2019.04.005