  Vision and touch are automatically integrated for the perception of sequences of events

Ernst, M., Bresciani, J.-P., & Dammeier, F. (2006). Vision and touch are automatically integrated for the perception of sequences of events. Talk presented at 7th International Multisensory Research Forum (IMRF 2006). Dublin, Ireland.


Creators:
Ernst, MO1, Author
Bresciani, J-P2, Author
Dammeier, F1, 2, Author
Affiliations:
1 Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497806
2 Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797

Content

Keywords: -
Abstract: The purpose of the present experiment was to investigate the integration of sequences of visual and tactile events. Participants were simultaneously presented with sequences of visual flashes and tactile taps and instructed to count either the flashes (session 1) or the taps (session 2). The number of flashes could differ from the number of taps by ±1. In both sessions, the perceived number of events was significantly influenced by the number of events presented in the task-irrelevant modality. Touch had a stronger influence on vision than vision had on touch. Interestingly, touch was also the more reliable of the two modalities, yielding less variable estimates when presented alone. In both sessions, the perceptual estimates were less variable when stimuli were presented in both modalities than when the task-relevant modality was presented alone. These results indicate that even when one signal is explicitly task-irrelevant, sensory information tends to be automatically integrated across modalities. They also suggest that the relative weight of each sensory channel in the integration process depends on its relative reliability. The results are described using a Bayesian probabilistic model for multimodal integration that accounts for the coupling between the sensory estimates.
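The reliability-dependent weighting described in the abstract can be illustrated with the standard maximum-likelihood cue-combination model (a simplification that omits the coupling term of the authors' full Bayesian model). This is a minimal sketch; the estimates and variances below are hypothetical values, not data from the study.

```python
# Sketch of reliability-weighted (maximum-likelihood) cue combination.
# Each cue is weighted by its inverse variance, so the more reliable
# modality (here, touch) dominates the fused estimate.

def combine(est_touch, var_touch, est_vision, var_vision):
    """Fuse two noisy estimates, weighting each by its relative reliability."""
    w_touch = (1.0 / var_touch) / (1.0 / var_touch + 1.0 / var_vision)
    w_vision = 1.0 - w_touch
    combined = w_touch * est_touch + w_vision * est_vision
    # The fused variance is smaller than either single-cue variance,
    # matching the reduced variability found for bimodal presentation.
    combined_var = 1.0 / (1.0 / var_touch + 1.0 / var_vision)
    return combined, combined_var

# Hypothetical example: touch counts 4 events (variance 0.5), vision
# counts 5 (variance 1.0); the fused count is pulled toward touch.
est, var = combine(est_touch=4.0, var_touch=0.5, est_vision=5.0, var_vision=1.0)
```

Because touch has the smaller variance, it receives the larger weight (2/3 here), which mirrors the asymmetry reported in the abstract: touch influences vision more than vision influences touch.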

Details

Language(s):
Date: 2006-06
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: -
Identifiers: URI: http://imrf.mcmaster.ca/IMRF/2006/viewabstract.php?id=191symposium=0
BibTex Citekey: ErnstBD2006
Degree: -

Event

Title: 7th International Multisensory Research Forum (IMRF 2006)
Venue: Dublin, Ireland
Start/end date: -
