  Automatic integration of visual, tactile and auditory signals for the perception of sequences of events

Bresciani, J.-P., Dammeier, F., & Ernst, M. (2006). Automatic integration of visual, tactile and auditory signals for the perception of sequences of events. Poster presented at 29th European Conference on Visual Perception, St. Petersburg, Russia.

Creators

Creators:
Bresciani, J.-P. (1), Author
Dammeier, F. (1, 2), Author
Ernst, M. O. (2), Author
Affiliations:
(1) Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
(2) Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497806

Content

Keywords: -
Abstract: Sequences of visual flashes, tactile taps, and auditory beeps were presented simultaneously. For each session, subjects were instructed to count the number of events presented in one modality (focal modality) and to ignore the other modalities (background). The number of events presented in the background modality(ies) could differ from the number of events in the focal modality. The experiment consisted of nine different sessions, with all nine combinations of visual, tactile, and auditory signals being tested. In each session, the perceived number of events in the focal modality was significantly influenced by the background signal(s). The visual modality, which had the largest intrinsic variance (focal modality presented alone), was the most susceptible to background-evoked bias and the least efficient in biasing the other two modalities. Conversely, the auditory modality, which had the smallest intrinsic variance, was the least susceptible to background-evoked bias and the most efficient in biasing the other two modalities. These results show that visual, tactile, and auditory sensory signals tend to be automatically integrated for the perception of sequences of events. They also suggest that the relative weight of each sensory signal in the integration process depends on its intrinsic relative reliability.
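
Note: the reliability-dependent weighting suggested in the last sentence corresponds to the standard maximum-likelihood model of cue combination, in which each signal is weighted in proportion to its inverse variance. The Python sketch below illustrates that model only; it is not the analysis reported in the poster, and the estimates and variances used are hypothetical.

def fuse(estimates, variances):
    # Weight each unimodal estimate by its reliability (inverse variance),
    # normalised so that the weights sum to one.
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = sum(w * e for w, e in zip(weights, estimates))
    fused_variance = 1.0 / total  # combined estimate is at least as reliable as the best cue
    return fused, weights, fused_variance

# Hypothetical example: a noisy visual signal reports 5 events, a reliable
# auditory signal reports 4. The fused percept lies much closer to the
# auditory report, mirroring the pattern of biases described in the abstract.
fused, weights, var = fuse([5.0, 4.0], [1.0, 0.2])
print(fused, weights, var)  # approx. 4.17, [0.17, 0.83], 0.17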

Details

Language(s):
Date: 2006-08
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: -
Identifiers: URI: http://www.perceptionweb.com/abstract.cgi?id=v060466
BibTeX citekey: 4180
Degree type: -

Event

Title: 29th European Conference on Visual Perception
Venue: St. Petersburg, Russia
Start/End date: -
