Free keywords:
-
Abstract:
The brain integrates multisensory information to create a coherent and more reliable perceptual estimate of the environment. This multisensory estimate is a linear combination of the individual unimodal estimates that are weighted by their relative reliabilities (e.g., Ernst and Banks, Nature, 2002).
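The weighted linear combination described above can be sketched as follows. This is an illustrative model of reliability-weighted cue combination in the spirit of Ernst and Banks (2002), not code or data from this study; the function name and all numbers are hypothetical.

```python
# Illustrative sketch of reliability-weighted cue combination
# (maximum-likelihood integration); all values are made up.

def combine(est_v, var_v, est_t, var_t):
    """Fuse visual and tactile estimates, weighting each by its
    reliability (inverse variance). Returns the fused estimate
    and the visual weight."""
    r_v, r_t = 1.0 / var_v, 1.0 / var_t   # reliabilities
    w_v = r_v / (r_v + r_t)               # visual weight
    w_t = 1.0 - w_v                       # tactile weight
    return w_v * est_v + w_t * est_t, w_v

# Degrading vision (larger visual variance) lowers the visual weight:
_, w_clear = combine(10.0, 1.0, 12.0, 4.0)   # reliable vision
_, w_noisy = combine(10.0, 9.0, 12.0, 4.0)   # degraded vision
assert w_clear > w_noisy
```

With equal variances the weights are equal and the fused estimate is the midpoint of the two unimodal estimates.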
Here we explored the neural substrates underlying visual-tactile integration in shape processing. To identify multisensory integration sites, we correlated behavioural data with neural activity evoked by multisensory integration.
Observers were presented with elliptical shapes that they could see and/or touch; their task was to judge the shape of the ellipse. Introducing conflicts between the seen and felt shape allowed us to examine whether participants relied more on visual or tactile information (the relative weights of vision and touch). To manipulate the weight attributed to vision, we degraded the visual information.
We observed a decrease in visual weight when vision was degraded and thus became less reliable. Discrimination performance improved when both modalities were presented together, indicating that visual and tactile shape information is indeed fused.
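The improvement in bimodal discrimination follows from the standard prediction that the variance of the optimally fused estimate is below either unimodal variance. A minimal sketch of that prediction, with hypothetical variances not drawn from this study:

```python
# Variance of the optimally fused (reliability-weighted) estimate;
# it is always smaller than either unimodal variance, which predicts
# better discrimination in the bimodal condition. Numbers are illustrative.

def fused_variance(var_v, var_t):
    """Variance of the maximum-likelihood combination of two cues."""
    return (var_v * var_t) / (var_v + var_t)

assert fused_variance(1.0, 4.0) < min(1.0, 4.0)
```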
The BOLD response in the anterior IPS bilaterally was modulated by visual input. Changes in the BOLD signal in these areas correlated with the cue weights, suggesting that this activity reflects the relative weighting of vision and touch.