
Released

Poster

Multisensory processing of looming signals in primates

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84069

Maier, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83932

Ghazanfar, A
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary material (publicly accessible)
There are no publicly accessible supplementary materials available
Citation

Maier, J., Logothetis, N., & Ghazanfar, A. (2005). Multisensory processing of looming signals in primates. Poster presented at 6th International Multisensory Research Forum (IMRF 2005), Trento, Italy.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-D56F-4
Abstract
The world is full of rapidly approaching danger. In order to survive in such a dynamic and dangerous environment, one must perceive and respond appropriately to such events. Looming signals are sensory cues that indicate the rapid approach of objects. Many animal species possess behavioral biases toward visual and auditory looming signals. However, the ability to integrate looming signals across modalities has not been directly studied and is the subject of the present work. First, using a preferential looking paradigm, we found that rhesus monkeys naturally integrate auditory-visual looming signals based on simple motion-in-depth cues (dynamic intensity change and visual expansion/contraction). Second, in a psychophysical study in humans, we found that humans also spontaneously integrate auditory-visual motion-in-depth signals. Finally, to investigate the neural correlates of this integration, we recorded local field potential (LFP) activity in the monkey temporal lobe while the subject was presented with auditory, visual, and bimodal looming and receding signals. Preliminary analysis of the LFP signals shows multisensory effects in auditory cortex, and increased coherence between simultaneously recorded LFP signals in auditory cortex and STS during bimodal stimulation. These results suggest that the brain may integrate information across modalities by synchronizing activity from different sensory areas.
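As a minimal illustration of the kind of coherence measure referred to above (not the authors' actual analysis pipeline), the sketch below estimates Welch-averaged spectral coherence between two simultaneously recorded LFP channels, standing in for an auditory cortex and an STS site; the sampling rate, window length, and synthetic signals are assumptions for demonstration only.

```python
# Illustrative sketch only: spectral coherence between two LFP-like signals.
# All parameters (sampling rate, segment length) and the synthetic data are
# assumed values, not taken from the study described above.
import numpy as np
from scipy.signal import coherence

fs = 1000.0                      # assumed LFP sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)  # 2 s of data

# Placeholder traces: a shared 10 Hz component plus independent noise
# yields partial coherence around 10 Hz.
shared = np.sin(2 * np.pi * 10 * t)
lfp_auditory = shared + 0.5 * np.random.randn(t.size)
lfp_sts = shared + 0.5 * np.random.randn(t.size)

# Welch-averaged magnitude-squared coherence between the two channels.
freqs, cxy = coherence(lfp_auditory, lfp_sts, fs=fs, nperseg=256)
print("Peak coherence %.2f at %.1f Hz" % (cxy.max(), freqs[cxy.argmax()]))
```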