Humans perceive the world through different perceptual modalities, which are processed in the brain by modality-specific areas and structures. However, there also exist multimodal neurons and areas specialized in integrating perceptual information to enhance or suppress brain responses. The particular way the human brain fuses crossmodal (or multimodal) perceptual information manifests itself first in behavioural studies. These crossmodal interactions are widely explored for some modalities, especially auditory and visual input, and less explored for others, like taste and olfaction, yet it is known that these effects can occur between any two modalities. The integration of sensory data is an important research area in computer science, and stands to benefit from studies of brain function; many biological processes serve as models for computer algorithms. On the other hand, computer models of sensor integration are built on mathematical principles, and provide normative insights into the functioning of the brain. This paper surveys the psychological and neurological findings pertaining to human multi-sensor fusion, followed by a brief review of the relevant computer science terminology and modeling approaches. The potential of an interdisciplinary approach to information fusion encompassing neuroscience, psychology and computer science has recently been recognized, and a multidisciplinary workshop on biologically inspired information fusion was organized to bring researchers together to determine a common agenda. The conclusion summarizes the agenda of research outlined at the workshop and attempts to raise research questions for the future.

Salah, A. A. (2008). Perceptual Fusion in Humans and Machines. In O. Tanrıdağ (Ed.), Cognitive Neuroscience at Marmaris - An Interdisciplinary Book on Selected Themes From The Previous Meetings 2005-2007 (pp. 71–88). Pharmapublication Planning.