  Visual influences on voice-selective neurons in the anterior superior-temporal plane

Perrodin, C., Kayser, C., Logothetis, N., & Petkov, C. (2009). Visual influences on voice-selective neurons in the anterior superior-temporal plane. Talk presented at 10th International Multisensory Research Forum (IMRF 2009). New York City, USA.

Creators

Creators:
Perrodin, C1, 2, Author
Kayser, C1, 2, Author
Logothetis, NK1, Author
Petkov, C1, Author
Affiliations:
1Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497798
2Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497808

Content

Keywords: -
Abstract: For social interaction and survival, primates rely heavily on vocal and facial communication signals from their conspecifics. To date, many studies have evaluated the unisensory representations of either vocal or facial information in regions thought to be "voice"- or "face"-selective. Other studies have directly evaluated the multisensory interactions of voices and faces, but have focused on posterior auditory regions closer to the primary auditory cortex. This work investigates multisensory interactions at the neuronal level in an auditory region in the anterior superior temporal plane, which contains one of the important regions for processing "voice"-related information. Extracellular recordings were obtained from the auditory cortex of macaque monkeys, targeting an anterior "voice" region that we have previously described with functional magnetic resonance imaging (fMRI). For stimulation we used movies of vocalizing monkeys and humans, which we matched in their low-level auditory and visual features. These dynamic face and voice stimuli allowed us to evaluate how neurons responded to the auditory, visual, or audio-visual components of the stimuli. Our experiments also contained control conditions consisting of several mismatched audiovisual stimulus combinations, such as 1) a voice matched to a face from a different species, 2) adding a temporal delay to the visual component of the stimulus, or 3) using an acoustically manipulated voice with the original facial stimulus. Our neuronal recordings identified a clustered population of voice-selective sites in the anterior superior temporal plane, ~5 mm anterior to field RT. A significant visual influence of the dynamic faces on the corresponding ("matched") vocalizations was observed in both the local-field potential (LFP) and the spiking activity (analog multiunit activity, AMUA): 38 of the sites showed audiovisual interactions in the LFP signals, and 60 in the AMUA.
In addition, the multisensory influence was significantly stronger for the matching voice and face stimuli than for any of the incongruent ("mismatched") control conditions, confirming the specificity of the cross-sensory influence on the neuronal activity. Our results provide evidence for visual influences in what has been characterized as an auditory "voice" area. This visual modulation was specific to behaviorally relevant voice-face associations and demonstrates that the processing of voice-related information in higher auditory regions can be influenced by multisensory input.

Details

Language(s): -
Date: 2009-06
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Type of review: -
Type of degree: -

Event

Title: 10th International Multisensory Research Forum (IMRF 2009)
Venue: New York City, USA
Start/end date: -
