In previous work, we proposed a unified approach for describing multimodal human-computer interaction and interaction constraints in terms of the sensory, motor, perceptual and cognitive functions of users. In this paper, we extend this work by providing formalised vocabularies that express the human functionalities and anatomical structures required by specific modalities. The central theme of our approach is to connect these modality representations with descriptions of the user, device and environmental constraints that influence the interaction. These descriptions can then be used in a reasoning framework that exploits formal connections among interaction modalities and constraints. The focus of this paper is on specifying a comprehensive vocabulary of the necessary concepts. Within the context of an interaction framework, we describe a number of examples that use this formalised knowledge.
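As a rough illustration of the kind of reasoning this vocabulary is meant to support, the short Python sketch below matches the functionalities a modality requires against the functionalities that a user, device or environmental constraint limits. All class names and functionality labels are invented for this example and do not reproduce the formal vocabulary defined in the paper.

# Hypothetical sketch: modalities declare the human functionalities they
# require; a profile collects the user, device and environmental
# constraints, expressed as the functionalities they limit; a simple
# check flags modalities that become unusable under those constraints.
# Names and labels are illustrative only, not the paper's vocabulary.

from dataclasses import dataclass, field


@dataclass
class Modality:
    name: str
    required_functions: set[str] = field(default_factory=set)


@dataclass
class InteractionProfile:
    """User, device and environmental constraints, expressed as the
    human functionalities they limit."""
    limited_functions: set[str] = field(default_factory=set)


def unusable_modalities(modalities, profile):
    """Return the modalities whose required functionalities are limited."""
    return [m for m in modalities
            if m.required_functions & profile.limited_functions]


if __name__ == "__main__":
    speech_output = Modality("speech output", {"hearing", "language processing"})
    visual_text = Modality("on-screen text", {"vision", "reading"})

    # e.g. a noisy environment or a hearing impairment limits "hearing"
    profile = InteractionProfile(limited_functions={"hearing"})

    for m in unusable_modalities([speech_output, visual_text], profile):
        print(f"{m.name} conflicts with the current constraints")

A reasoning framework over formal descriptions could of course go well beyond such a set-intersection check, for example by inferring limited functionalities from anatomical structures, but the basic pattern of relating modality requirements to constraint descriptions is the same.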
Series: CTIT Workshop Proceedings WP 07-01
Editors: I.F. van der Sluis, M. Theune, E. Reiter, E.J. Krahmer
Project: NL-Passepartout
Workshop: Workshop on Multimodal Output Generation (MOG 2007)
Research group: Human-Centered Data Analytics

Obrenovic, Z., Troncy, R., & Hardman, L. (2007). Vocabularies for Description of Accessibility Issues in Multimodal User Interfaces. In I. F. van der Sluis, M. Theune, E. Reiter, & E. Krahmer (Eds.), MOG 2007: Proceedings of the Workshop on Multimodal Output Generation (pp. 117–128). CTIT Workshop Proceedings WP 07-01.