
Released

Poster

Auditory Processing under Steady State Visual Driving

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84444

Tsiatsis, P
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84112

Noppeney, U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Tsiatsis, P., & Noppeney, U. (2011). Auditory Processing under Steady State Visual Driving. Poster presented at 12th Conference of Junior Neuroscientists of Tübingen (NeNA 2011), Heiligkreuztal, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B9B0-0
Abstract
Neuronal oscillations are considered crucial for information processing in the brain, as they can potentially regulate information flow and dynamically bind different cortical and non-cortical regions. This MEG study investigated whether the effect of a transient sound was modulated by the phase of oscillations in the visual cortex. To induce steady-state oscillations in the visual cortex, we presented subjects with continuous visual signals luminance-modulated at 4 Hz or 10 Hz. The transient sounds were presented locked to four phases of the periodic visual stimulus (i.e. 0, π/2, π and 3π/2). We then investigated whether the effect of the sound depends on the phase of the visual steady-state activity by testing for an interaction between sound and visual phase. Conversely, we will investigate the effect of sound processing on visual steady-state processing given the state of the visual cortex. The results from the two experiments (4 Hz and 10 Hz) will be combined and compared. Based on recent neurophysiological evidence, we hypothesize that oscillations at different frequencies play distinct functional roles in multisensory integration.
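
As a purely illustrative aside (not code from the study), the short Python sketch below shows how sound onsets can be scheduled at fixed phases of a periodic luminance modulation. The function name, the number of cycles and the printed output are assumptions for illustration; only the 4 Hz / 10 Hz rates and the four phases are taken from the abstract.

# Minimal sketch, assuming a sinusoidal luminance modulation whose cycle
# starts at phase 0. All names and values beyond the 4/10 Hz rates and the
# four phases are illustrative assumptions, not the authors' method.
import numpy as np

def phase_locked_onsets(freq_hz, phases_rad, n_cycles, start_cycle=1):
    """Return sound-onset times (s) falling at the given phases of a
    periodic luminance modulation of frequency `freq_hz`."""
    period = 1.0 / freq_hz
    cycles = np.arange(start_cycle, start_cycle + n_cycles)
    # One onset per (cycle, phase) pair: t = cycle*T + (phase / 2*pi) * T
    onsets = (cycles[:, None] + np.asarray(phases_rad)[None, :] / (2 * np.pi)) * period
    return onsets.ravel()

if __name__ == "__main__":
    # The four phases of the visual cycle mentioned in the abstract.
    phases = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
    for f in (4.0, 10.0):  # the two modulation frequencies used in the study
        t = phase_locked_onsets(f, phases, n_cycles=3)
        print(f"{f:>4.1f} Hz onsets (s):", np.round(t, 3))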