
Released

Poster

From Independence to Fusion: A Comprehensive Model for Sensory Integration

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83906

Ernst,  MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Ernst, M. (2005). From Independence to Fusion: A Comprehensive Model for Sensory Integration. Poster presented at Fifth Annual Meeting of the Vision Sciences Society (VSS 2005), Sarasota, FL, USA.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D45D-7
Abstract
Recently we demonstrated that humans integrate visual and haptic information in a statistically optimal way (Ernst & Banks, 2002); that is, subjects make optimal use of the information provided in order to reach the decision required by the task. As Hillis et al. (2002) showed, however, this does not necessarily imply that the sensory signals are completely fused into a unified percept. If subjects completely fused the signals, they would, by definition, retain no access to the individual incoming sources of information. Instead, Hillis et al. found a weaker form of interaction between the sensory signals. The degree of interaction between the signals can be taken as a definition of the strength of coupling between them: there is no coupling if the signals are independent, and maximal coupling if the signals are fused. Using Bayesian decision theory, I here propose a comprehensive model that can account for both results. The prior used in this model represents the probability of the physical relationship (mapping) between the signals derived by the sensory systems. This probability distribution is narrowly tuned if the mapping between the physical signals is relatively constant (e.g., the mapping between texture and disparity signals). If the mapping changes easily (e.g., the mapping between visual and haptic signals), the distribution of possible mappings reflected in the prior is wider. This prior is called the "coupling prior" because its tuning determines the level of interaction, i.e., the strength of coupling. I will further present data from a visual-haptic discrimination experiment that support these theoretical considerations. Taken together, I propose that a Bayesian model using a "coupling prior" to describe sensory interactions is a convenient theoretical framework for understanding multisensory integration as a continuous process between independence and complete fusion.
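The continuum the abstract describes can be illustrated with a minimal numerical sketch. Assuming Gaussian likelihoods for a visual and a haptic cue and a zero-mean Gaussian coupling prior on the difference between the two underlying signals (an assumption made here for illustration; the poster's exact formulation may differ), the MAP estimates follow from a 2x2 linear system. A very wide prior recovers independence (each estimate equals its own cue), while a very narrow prior recovers the reliability-weighted fusion of Ernst & Banks (2002). All variable names below are my own.

```python
import numpy as np

def coupled_estimates(x_v, x_h, var_v, var_h, var_c):
    """MAP estimates of two signals from noisy cues x_v, x_h with cue
    variances var_v, var_h, under a Gaussian 'coupling prior' of variance
    var_c on the signal difference (s_v - s_h).

    var_c -> inf : no coupling (independence), estimates equal the cues.
    var_c -> 0   : complete fusion, both estimates approach the
                   reliability-weighted average of the cues.
    """
    a, b = 1.0 / var_v, 1.0 / var_h                   # cue reliabilities
    c = 0.0 if np.isinf(var_c) else 1.0 / var_c       # coupling strength
    # Normal equations of the quadratic MAP objective:
    #   (s_v - x_v)^2/var_v + (s_h - x_h)^2/var_h + (s_v - s_h)^2/var_c
    A = np.array([[a + c, -c],
                  [-c, b + c]])
    rhs = np.array([a * x_v, b * x_h])
    return np.linalg.solve(A, rhs)

# Independence limit: estimates stay at the raw cue values.
print(coupled_estimates(10.0, 12.0, 1.0, 4.0, np.inf))
# Near-fusion limit: both estimates converge to the weighted average
# (1/1 * 10 + 1/4 * 12) / (1/1 + 1/4) = 10.4.
print(coupled_estimates(10.0, 12.0, 1.0, 4.0, 1e-9))
```

Intermediate values of the coupling variance yield the partial interaction that Hillis et al. (2002) reported: the two estimates are pulled toward each other without collapsing onto a single fused percept.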