
Record

Released

Poster

Natural, metaphoric and linguistic auditory-visual interactions

MPG Authors

Sadaghiani, S
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Maier, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Noppeney, U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe
Supplementary material (freely accessible)
There are no freely accessible supplementary materials available
Citation

Sadaghiani, S., Maier, J., & Noppeney, U. (2008). Natural, metaphoric and linguistic auditory-visual interactions. Poster presented at 9th International Multisensory Research Forum (IMRF 2008), Hamburg, Germany.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-C873-1
Abstract
To form a coherent percept of our dynamic environment, the brain merges motion information from the auditory and visual senses. Yet not only auditory motion but also ‘metaphoric’ pitch has been shown to influence visual motion discrimination. Here, we systematically investigate the neural systems that mediate auditory influences on visual motion discrimination in natural, metaphoric and linguistic contexts. In a visual selective attention paradigm, subjects discriminated the direction of visual motion at several levels of ambiguity, while ignoring a simultaneous auditory stimulus that was 1) ‘natural’ MOTION: left- vs. right-moving white noise, 2) ‘metaphoric’ PITCH: rising vs. falling pitch, or 3) ‘linguistic’ SPEECH: spoken German words denoting directions, e.g. ‘links’ vs. ‘rechts’. Behaviourally, all three classes of auditory stimuli induced a comparable directional bias. At the neural level, the interaction between visual ambiguity and audition revealed an auditory influence on visual motion processing for MOTION in left hMT+/V5 and for SPEECH in the right intraparietal sulcus. Direct comparisons across contexts confirmed this functional dissociation: the interaction effect gradually decreased in left hMT+/V5 for MOTION > PITCH > SPEECH and in right IPS for SPEECH > PITCH > MOTION. In conclusion, while natural audio-visual integration of motion signals emerges in motion-processing areas, linguistic interactions are revealed primarily in higher-level fronto-parietal regions.