
Record


Released

Conference Paper

Effect of attention on multimodal cue integration

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83960

Helbig, H
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83906

Ernst, MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Helbig, H., & Ernst, M. (2004). Effect of attention on multimodal cue integration. In 4th International Conference EuroHaptics 2004 (pp. 524-527). München, Germany: Institute of Automatic Control Engineering.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-D8E1-8
Abstract
Humans gather information about their environment from multiple sensory channels. Cues from separate sensory modalities (e.g. vision and haptics) appear to be combined in a statistically optimal way according to a maximum-likelihood estimator [1]. Ernst and Banks showed that for bimodal perceptual estimates, the weight attributed to one sensory channel changes when its relative reliability is modified by increasing the noise associated with its signal. Because increasing the attentional load of a given sensory channel is likely to change its reliability, we assume that such a modification would also alter the weights of the different cues in multimodal perceptual estimates. Here we examine this hypothesis using a dual-task paradigm. Subjects’ main task is to estimate the size of a raised bar using vision alone, haptics alone, or both modalities combined. Their performance on the main task alone is compared to the performance obtained when an additional visual ‘distractor’ task is performed simultaneously with the main task (dual-task paradigm). We found that vision-based estimates are more affected by a visual ‘distractor’ than haptics-based estimates. Our findings substantiate that attention influences the weighting of the different sensory channels in multimodal perceptual estimates: when attention is drawn away from the visual modality, haptic estimates are weighted more heavily in visual-haptic size discrimination. In further experiments, we will examine the influence of a haptic ‘distractor’ task. We expect a haptic ‘distractor’ to interfere to a greater extent with the haptic primary task, while the vision-based estimates in the main task should be less affected. We will then further examine whether cue integration remains statistically optimal.
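The maximum-likelihood rule referenced above [1] makes the predicted weighting explicit. In the standard formulation (notation ours, not taken from this record), each unimodal size estimate is weighted by its relative reliability, i.e. its inverse variance:

\[
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_H^{2}},
\qquad
w_H = 1 - w_V,
\]
\[
\sigma_{VH}^{2} = \frac{\sigma_V^{2}\,\sigma_H^{2}}{\sigma_V^{2} + \sigma_H^{2}} \;\le\; \min\!\left(\sigma_V^{2}, \sigma_H^{2}\right).
\]

On this account, a visual distractor task that raises the visual noise $\sigma_V$ lowers $w_V$ and correspondingly raises $w_H$, which is exactly the shift toward haptic estimates reported in the abstract; whether integration remains statistically optimal can then be tested by comparing the measured bimodal variance against the predicted $\sigma_{VH}^{2}$.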