Abstract:
To respond more quickly to events in natural environments, the human brain merges information from multiple senses into a more reliable percept. Multisensory integration processes have been demonstrated in a distributed neural system encompassing sensory-specific, higher association and prefrontal cortices. Using fMRI and psychophysical methods, this dissertation investigates the functional similarities, differences and constraints that govern the integration of auditory and visual information in different regions of the human cerebral cortex. By characterizing their temporal response codes, effective connectivity patterns and underlying computations for combining multisensory inputs, this work provides evidence for the integration of specific types of information at three functionally specialized processing stages. At the first stage, multisensory interactions in sensory-specific regions signal a common sensory source by integrating spatiotemporally aligned auditory and visual inputs, enhancing stimulus detection. At the second stage, multisensory interactions in higher association regions integrate complex environmental features into higher-order representations, forming a unified percept and mediating multisensory benefits in object recognition. At the third stage, multisensory interactions in the prefrontal cortex mediate response selection based on perceptual information from the auditory and visual modalities, yielding multisensory facilitation of reaction times. This dissertation constitutes the first systematic attempt to dissociate the contributions of sensory-specific, higher association and prefrontal areas to audiovisual integration in the human brain.