
Released

Poster

Leading or Lagging: Temporal prediction errors are expressed in auditory and visual cortices

MPS-Authors

Lee, HL
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Noppeney, U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Lee, H., & Noppeney, U. (2012). Leading or Lagging: Temporal prediction errors are expressed in auditory and visual cortices. Poster presented at the 18th Annual Meeting of the Organization for Human Brain Mapping (OHBM 2012), Beijing, China.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-B738-3
Abstract
Introduction: In our natural environment, our brain is exposed to a constant influx of multisensory signals that dynamically evolve at multiple timescales. Statistical regularities are important cues informing the brain whether two sensory signals are generated by a common physical process and should hence be integrated. This fMRI study investigated how the brain detects violations of these statistical regularities induced by the temporal misalignment of visual and auditory signals. Specifically, we arbitrated between two hypotheses that make opposite predictions. Under the predictive coding framework, the brain iteratively optimizes an internal model of its multisensory environment by reducing the error between its predictions and the sensory inputs. An audiovisual misalignment that violates the natural statistical regularities should thus induce a prediction error signal. For visual leading asynchrony, we would expect a prediction error signal in the auditory cortex, because the delayed auditory signal violates the temporal predictions of the 'leading' visual system (and vice versa for auditory leading asynchrony) [2,3]. Alternatively, from the perspective of the biased competition model, the misaligned auditory and visual signals compete for processing resources. For visual leading asynchrony, we would expect an increased BOLD signal in the visual system, indexing the higher salience of the leading visual signal, which then suppresses the temporally incompatible auditory signal [1].

Methods: 37 subjects participated in this fMRI study (Siemens TimTrio 3T scanner, GE-EPI, TE = 40 ms, 42 axial slices, TR = 3 s). They passively perceived audiovisual movies of natural speech, sine-wave speech (SWS) and piano music. The audiovisual signals were synchronous, auditory leading (+240 ms) or visual leading (-240 ms). Hence, the 3 x 3 factorial design manipulated (i) temporal alignment (3 levels) and (ii) stimulus class (3 levels). The activation trials were interleaved with 8 s fixation blocks. To allow for random-effects analyses, contrast images (single condition > fixation) for each subject were entered into a second-level ANOVA, which modelled the 9 effects of the 3 x 3 design. (1) Using a conjunction analysis (testing the conjunction null), we identified differences between auditory leading and visual leading conditions that are common to speech, SWS and music. (2) We tested for asynchrony effects (i.e. auditory leading > synchronous, visual leading > synchronous) separately for each stimulus class. Results are reported at p < .05, corrected for multiple comparisons at the cluster level using a height threshold of p < .001 uncorrected.

Results: (1) Common to all stimulus classes, auditory leading relative to visual leading signals increased activations in bilateral V5/hMT+. In contrast, visual leading relative to auditory leading signals increased activations in bilateral Heschl's gyri (Fig. 1). (2) Auditory leading relative to synchronous AV signals increased activations in the auditory system, extending from Heschl's gyrus into posterior superior temporal sulcus/gyrus (STS/STG) bilaterally. Conversely, visual leading relative to synchronous signals increased activations in bilateral occipito-temporal cortices, predominantly in V5/hMT+ (Fig. 2).
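As a rough illustration of the second-level analysis described in the Methods, the Python sketch below builds contrast vectors over the 9 conditions of the 3 x 3 design (temporal alignment x stimulus class): the per-class auditory leading > visual leading contrasts that would enter the conjunction analysis, and the asynchrony contrasts against the synchronous condition. The condition labels, their ordering and the helper function are illustrative assumptions, not details taken from the original analysis pipeline.

import numpy as np

# Assumed condition labels and ordering for the 9 cells of the 3 x 3 design
# (illustrative names; the original design matrix is not specified in the abstract).
alignments = ["sync", "aud_lead", "vis_lead"]
classes = ["speech", "sws", "music"]
conditions = [f"{c}_{a}" for c in classes for a in alignments]  # 9 condition columns

def contrast(pos, neg):
    # Contrast vector with +1 on the 'pos' conditions and -1 on the 'neg' conditions.
    v = np.zeros(len(conditions))
    for name in pos:
        v[conditions.index(name)] = 1.0
    for name in neg:
        v[conditions.index(name)] = -1.0
    return v

# 1. Per-class contrasts entering the conjunction (auditory leading > visual leading),
#    tested jointly across speech, SWS and music against the conjunction null.
conjunction_aud_gt_vis = [contrast([c + "_aud_lead"], [c + "_vis_lead"]) for c in classes]

# 2. Asynchrony effects, separately for each stimulus class.
aud_lead_gt_sync = {c: contrast([c + "_aud_lead"], [c + "_sync"]) for c in classes}
vis_lead_gt_sync = {c: contrast([c + "_vis_lead"], [c + "_sync"]) for c in classes}

print(conditions)
print(conjunction_aud_gt_vis[0])  # speech: +1 on speech_aud_lead, -1 on speech_vis_lead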