Released

Conference Paper

Effect of attention on multimodal cue integration

MPS-Authors

Helbig, H
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ernst, MO
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)

EuroHaptics-2004-Helbig.pdf
(Any fulltext), 879KB

Citation

Helbig, H., & Ernst, M. O. (2004). Effect of attention on multimodal cue integration. In M. Buss & M. Fritschi (Eds.), 4th International Conference EuroHaptics 2004 (pp. 524-527). München, Germany: Institute of Automatic Control Engineering.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D8E1-8
Abstract
Humans gather information about their environment from multiple sensory channels. It seems that cues from separate sensory modalities (e.g. vision and haptics) are combined in a statistically optimal way according to a maximum-likelihood estimator [1]. Ernst and Banks showed that for bimodal perceptual estimates, the weight attributed to one sensory channel changes when its relative reliability is modified by increasing the noise associated with its signal. Because increasing the attentional load of a given sensory channel is likely to change its reliability, we assume that such a modification would also alter the weights of the different cues for multimodal perceptual estimates. Here we examine this hypothesis using a dual-task paradigm. Subjects' main task is to estimate the size of a raised bar using vision alone, haptics alone, or both modalities combined. Their performance on the main task alone is compared to their performance when an additional visual 'distractor' task is performed simultaneously with the main task (dual-task paradigm). We found that vision-based estimates are more affected by a visual 'distractor' than haptics-based estimates. Our findings substantiate that attention influences the weighting of the different sensory channels for multimodal perceptual estimates: when attention is diverted from the visual modality, the haptic estimates are weighted more heavily in visual-haptic size discrimination. In further experiments, we will examine the influence of a haptic 'distractor' task. We would expect a haptic 'distractor' to interfere to a greater extent with the haptic primary task, while the vision-based estimates in the main task should be less affected. We will then further examine whether cue integration is still statistically optimal.
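
For context, the maximum-likelihood rule referenced above [1] weights each cue by its reliability (the inverse of its variance), so degrading one channel shifts weight toward the other. The Python sketch below illustrates this rule under hypothetical variance values; the function name ml_combine and all numbers are illustrative assumptions, not figures from the paper.

# Minimal sketch of maximum-likelihood cue combination [1]:
# each cue is weighted by its reliability (inverse variance).
# All variance values below are hypothetical, not data from this paper.

def ml_combine(est_v, var_v, est_h, var_h):
    """Reliability-weighted combination of a visual and a haptic size estimate."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)  # visual weight
    w_h = 1.0 - w_v                                    # haptic weight
    combined = w_v * est_v + w_h * est_h               # bimodal estimate
    combined_var = 1.0 / (1.0 / var_v + 1.0 / var_h)   # variance of the bimodal estimate
    return combined, combined_var

# Reliable vision: the visual estimate dominates.
print(ml_combine(est_v=55.0, var_v=1.0, est_h=53.0, var_h=4.0))
# Vision degraded (e.g. by an attentional load): weight shifts toward haptics.
print(ml_combine(est_v=55.0, var_v=9.0, est_h=53.0, var_h=4.0))

Under this rule the bimodal variance never exceeds the smaller single-cue variance, which is the sense in which the combination is statistically optimal.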