Item


Released

Poster

The Time Scales of Information Representation in Auditory Cortex are Stimulus Dependent

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84006

Kayser,  C
Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis,  NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84966

Panzeri,  S
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Kayser, C., Logothetis, N., & Panzeri, S. (2010). The Time Scales of Information Representation in Auditory Cortex are Stimulus Dependent. Poster presented at AREADNE 2010: Research in Encoding And Decoding of Neural Ensembles, Santorini, Greece.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-BFF0-0
Abstract
Recent work has shown that in auditory cortex acoustic stimuli are potentially encoded by different neural codes, each operating on a different temporal scale. For example, the millisecond-precise timing of individual neurons' action potentials has been implicated, as have firing rate modulations on slower scales and the timing of spikes relative to ongoing oscillatory background activity [1]. Here we asked whether the temporal precision of these putative neural codes is fixed and inherent to the system, or whether it is determined by the acoustic stimulus. Stimulus information in the different codes was compared during stimulation with naturalistic sounds and with sequences of random tones. The natural sounds had a typical autocorrelation time of around 20–30 ms (computed from the envelope of individual frequency bands), while the random tones had a much shorter autocorrelation time (around 10 ms). Neural activity was recorded using multiple electrodes in primary and secondary auditory cortex of macaque monkeys passively listening to these stimuli. Mutual information between stimulus and neural activity was characterized using previously established approaches [2,3]. We found that the precise time scale of each code depends on the acoustic stimulus. For binary spike words (spike timing), the temporal precision required to decode maximal information was higher during stimulation with random tones (average 7 ms) than with natural sounds (average 12 ms). In addition, the degree to which field potentials were stimulus locked (‘entrained’) varied between sound types: during stimulation with random tones, entrainment was stronger and extended to much higher frequencies (up to 60 Hz) than during stimulation with natural sounds (about 30 Hz). These results extend previous findings in the visual thalamus and demonstrate that the temporal precision of sensory neurons' responses in auditory cortex depends on the temporal structure of the stimulus. In particular, stimuli with shorter correlation times, and hence faster intrinsic time scales, induce responses that vary on shorter time scales. This implies that the relevant time scales of neural codes are not fixed, but are dynamically adapted to, or reflect, the environment.
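
The analysis summarized above, comparing the stimulus information carried by binary spike words as a function of the temporal bin width, can be illustrated with a short sketch. The code below is a minimal illustration under stated assumptions, not the poster's actual pipeline: it uses a plug-in (uncorrected) mutual-information estimate rather than the bias-corrected estimators of [2,3], synthetic spike trains in place of recorded activity, and hypothetical helper names (binary_spike_words, mutual_information) introduced only for this example.

import numpy as np

def binary_spike_words(spike_times, t_start, t_stop, bin_width, word_length):
    # Discretize a spike train into non-overlapping binary "words":
    # word_length consecutive bins of width bin_width (seconds),
    # each bin marked 1 if it contains at least one spike.
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    bits = (counts > 0).astype(int)
    n_words = len(bits) // word_length
    bits = bits[:n_words * word_length].reshape(n_words, word_length)
    # Encode each word as an integer so identical patterns can be counted.
    return bits @ (2 ** np.arange(word_length)[::-1])

def mutual_information(stimulus_ids, responses):
    # Plug-in estimate of I(S; R) in bits from paired discrete samples
    # (no sampling-bias correction, unlike the estimators of refs [2,3]).
    s_vals, s_idx = np.unique(stimulus_ids, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    np.add.at(joint, (s_idx, r_idx), 1.0)
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)
    pr = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps * pr)[nz])))

# Toy scan over candidate temporal precisions (bin widths) with synthetic data.
rng = np.random.default_rng(0)
n_trials, duration, n_stimuli = 400, 0.5, 4
stimuli = rng.integers(0, n_stimuli, n_trials)
info_by_width = {}
for bin_width in (0.002, 0.004, 0.007, 0.012, 0.020):
    words = []
    for s in stimuli:
        # Synthetic spike train whose count depends on the stimulus identity.
        n_spikes = rng.poisson(5 + 10 * s)
        spikes = np.sort(rng.uniform(0.0, duration, n_spikes))
        words.append(binary_spike_words(spikes, 0.0, duration, bin_width, 4)[0])
    info_by_width[bin_width] = mutual_information(stimuli, np.array(words))

best = max(info_by_width, key=info_by_width.get)
print(f"bin width maximizing information: {best * 1000:.0f} ms")

In the framework described in the abstract, the bin width at which decoded information is maximal is what is reported as the code's temporal precision (e.g. around 7 ms for random tones versus around 12 ms for natural sounds).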