

Released

Poster

Integration of shape information from vision and touch: Optimal perception and neural correlates

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83960

Helbig,  HB
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83906

Ernst,  MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Helbig, H., Ricciardi, E., Pietrini, P., & Ernst, M. (2006). Integration of shape information from vision and touch: Optimal perception and neural correlates. Poster presented at 9th Tübingen Perception Conference (TWK 2006), Tübingen, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D2A3-8
Abstract
Recently, Ernst and Banks (2002) showed that visual-haptic size information is integrated in a statistically optimal manner, i.e., visual and haptic size estimates are weighted according to their reliabilities. Here we investigated whether the same is true for visual-haptic shape information. We further explored the neural substrates underlying visual-haptic integration in shape processing using fMRI and examined whether neural activity elicited by multisensory integration correlates with cue weighting. For this purpose we used elliptical ridges on objects that subjects could see and/or feel: subjects saw the front of the object and felt the back. The elongation of the elliptical ridges on the two sides of an object could differ, and the subjects' task was to decide whether the ellipse was elongated vertically or horizontally. This way we could study the weights given to vision and touch during shape discrimination. We varied the visual weight by degrading the visual information with blur. The psychophysical experiments showed that visual and haptic shape information is integrated in a statistically optimal way even when the visual information is displayed via a mirror. That is, we observed a decrease in visual weight when vision was degraded and thus less reliable. Furthermore, we found an increase in discrimination performance when both modalities were presented together. These results were crucial because the fMRI experiments relied on presenting objects in a mirror. We also measured neural activity with fMRI while individuals performed the same ellipse discrimination task. When visual reliability was reduced in the visual-haptic task, neural responses decreased in the lateral occipital cortex and increased in the anterior intraparietal cortex, a brain region strongly implicated in multisensory integration.
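The optimal-integration model invoked in the abstract (Ernst & Banks, 2002) combines the two cues with weights inversely proportional to their variances, so degrading vision (larger visual noise) shifts the weight toward touch. A minimal sketch of that weighting scheme, using illustrative numbers rather than data from this study:

```python
import numpy as np

def mle_integrate(mu_v, sigma_v, mu_h, sigma_h):
    """Reliability-weighted (maximum-likelihood) integration of two cues.

    Each cue's weight is its inverse variance, normalized; the combined
    estimate is pulled toward the more reliable cue, and the combined
    variance is lower than either single-cue variance.
    """
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_h**2)
    w_h = 1 - w_v
    mu_combined = w_v * mu_v + w_h * mu_h
    sigma_combined = np.sqrt((sigma_v**2 * sigma_h**2) /
                             (sigma_v**2 + sigma_h**2))
    return mu_combined, sigma_combined

# Reliable vision dominates the combined shape estimate...
mu, sigma = mle_integrate(mu_v=1.0, sigma_v=0.1, mu_h=0.0, sigma_h=0.2)

# ...while blurring vision (raising sigma_v) lowers the visual weight,
# mirroring the blur manipulation described in the abstract.
mu_blur, _ = mle_integrate(mu_v=1.0, sigma_v=0.4, mu_h=0.0, sigma_h=0.2)
```

Here the combined standard deviation is always below that of either single cue, which corresponds to the improved discrimination performance reported when both modalities were presented together.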