Keyboard interaction patterns on a smartphone are the input for many intelligent emotion-aware applications, such as adaptive interfaces, optimized keyboard layouts, and automatic emoji recommendation in IM applications. The simplest approach, the Experience Sampling Method (ESM), is to systematically gather self-reported emotion labels from users, which serve as ground-truth labels, and to build a supervised prediction model for emotion inference. However, as manual self-reporting is fatigue-inducing and attention-demanding, self-report requests must be scheduled at favorable moments to ensure high-fidelity responses. In this paper, we perform fine-grained keyboard interaction analysis to determine suitable probing moments. Keyboard interaction patterns, both cadence and inter-keystroke latency, lend themselves naturally to time- and frequency-domain analysis. We conduct a 3-week in-the-wild study (N = 22) to log keyboard interaction patterns together with self-report details indicating (in)opportune probing moments. Analysis of the dataset reveals that time-domain features (e.g., session length, session duration) and frequency-domain features (e.g., number of peak amplitudes, value of the peak amplitude) vary significantly between opportune and inopportune probing moments. Driven by these analyses, we develop a generalized (all-user) Random Forest model that identifies opportune probing moments with an average F-score of 93%. We also carry out an explainability analysis of the model using SHAP (SHapley Additive exPlanations), which reveals that session length and peak amplitude have the strongest influence in determining the probing moments.
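The time- and frequency-domain features named in the abstract could be derived roughly as in the sketch below, assuming each typing session is a list of keystroke timestamps; the function name and exact feature definitions are illustrative assumptions, not the paper's published pipeline:

```python
import numpy as np

def keystroke_features(timestamps):
    """Illustrative sketch: derive time- and frequency-domain features
    from one typing session's keystroke timestamps (in seconds)."""
    ts = np.asarray(timestamps, dtype=float)
    latencies = np.diff(ts)  # inter-keystroke latencies

    features = {
        "session_length": len(ts),           # number of keystrokes
        "session_duration": ts[-1] - ts[0],  # elapsed time (seconds)
    }

    # Frequency-domain view: magnitude spectrum of the (mean-removed)
    # latency sequence via a real FFT.
    spectrum = np.abs(np.fft.rfft(latencies - latencies.mean()))
    features["peak_amplitude"] = float(spectrum.max())

    # Count local maxima as a proxy for "number of peak amplitudes".
    peaks = (spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:])
    features["num_peaks"] = int(peaks.sum())
    return features
```

Such per-session feature vectors, paired with ESM self-report labels, would then feed a classifier such as a Random Forest.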

26th International Conference on Intelligent User Interfaces: Where HCI Meets AI, IUI 2021
Centrum Wiskunde & Informatica, Amsterdam (CWI), The Netherlands

Ghosh, S., Mandi, S., Mitra, B., & De, P. (2021). Exploring smartphone keyboard interactions for Experience Sampling Method driven probe generation. In International Conference on Intelligent User Interfaces (pp. 133–138). doi:10.1145/3397481.3450669