
Item Details


Released

Poster

Steady-state responses in MEG demonstrate information integration within but not across the auditory and visual senses

MPS-Authors

Giani, A
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Kleiner, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Noppeney, U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Giani, A., Erick, O., Belardinelli, P., Kleiner, M., Preissl, H., & Noppeney, U. (2011). Steady-state responses in MEG demonstrate information integration within but not across the auditory and visual senses. Poster presented at 12th Conference of Junior Neuroscientists of Tübingen (NeNA 2011), Heiligkreuztal, Germany.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-B9E6-7
Abstract
To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs) to periodic auditory and/or visual inputs. The 3 × 3 factorial design manipulated (1) modality (auditory only, visual only, and audiovisual) and (2) temporal dynamics (static, dynamic1, and dynamic2). In the static conditions, subjects were presented with (1) visual gratings, luminance modulated at 6 Hz, and/or (2) pure tones, frequency modulated at 40 Hz. To manipulate perceptual synchrony, we imposed additional slow modulations on the auditory and visual stimuli either at the same frequency (0.2 Hz = synchronous) or at different frequencies (0.2 Hz vs. 0.7 Hz = asynchronous). In the dynamic conditions, this also enabled us to investigate the integration of two dynamic features within one sensory modality (e.g., a pure tone frequency modulated at 40 Hz and amplitude modulated at 0.2 Hz). We reliably identified crossmodulation frequencies when these two stimulus features were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined from the auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection that is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within, but not across, sensory modalities at the primary cortical level.
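The crossmodulation logic can be illustrated with a toy simulation (a minimal sketch, not the authors' analysis pipeline; the sampling rate, duration, and modulation depth are assumed for illustration): when a 40 Hz carrier and a 0.2 Hz modulation interact nonlinearly, e.g. multiplicatively, spectral power appears at the crossmodulation frequencies 40 ± 0.2 Hz, whereas a purely additive combination of the two signals leaves those frequency bins empty.

```python
import numpy as np

# Minimal sketch (not the study's analysis code): a multiplicative, i.e.
# nonlinear, interaction of two frequency-tagged signals produces power at
# crossmodulation (intermodulation) frequencies f1 +/- f2, whereas a purely
# additive combination of the same two signals does not.
fs = 600.0                       # sampling rate in Hz (assumed for the demo)
t = np.arange(0, 60, 1 / fs)     # 60 s of signal (assumed duration)
f1, f2 = 40.0, 0.2               # carrier and slow modulation frequencies

# Independent (additive) superposition vs. nonlinear (multiplicative) interaction
additive = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
interacting = (1 + 0.5 * np.sin(2 * np.pi * f2 * t)) * np.sin(2 * np.pi * f1 * t)

def power_at(signal, freq):
    """Spectral power at the FFT bin closest to the given frequency."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

# Crossmodulation frequencies at 39.8 Hz and 40.2 Hz: near zero for the
# additive signal, clearly above zero for the interacting one.
for f in (f1 - f2, f1 + f2):
    print(f"{f:5.1f} Hz  additive: {power_at(additive, f):9.2e}  "
          f"interacting: {power_at(interacting, f):9.2e}")
```

The multiplicative term expands to cosines at f1 − f2 and f1 + f2, which is why a nonlinear interaction, and only such an interaction, tags those sideband frequencies; the study's inference follows the same pattern, reading the presence of sideband peaks in the MEG spectrum as evidence of integration within a modality and their absence as evidence against audiovisual integration at this level.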