Poster

Temporal calibration between the visual, auditory and tactile senses: A psychophysical approach

MPS-Authors

Machulla, T
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Di Luca, M
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ernst, M
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Machulla, T., Di Luca, M., & Ernst, M. (2007). Temporal calibration between the visual, auditory and tactile senses: A psychophysical approach. Poster presented at 1st Peach Summer School, Santorini, Greece.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CD1B-1
Abstract
Human observers acquire information about the physical properties of the environment through different sensory modalities. For natural events, these sensory signals exhibit a specific temporal, spatial and contextual configuration that aids their integration into a coherent multisensory percept. In multimodal virtual environments, however, signals have to be generated and displayed separately for each modality, which may result in a miscalibration of these signals. This, in turn, can greatly reduce the observer's sense of immersion and presence.
Using psychophysical methods, we investigate fundamental questions regarding how the temporal alignment of signals from the visual, auditory and tactile modalities is achieved. A first project examines the perception of subjective simultaneity of signals. Simultaneity detection poses a non-trivial matching problem for the human brain: physical and neural transmission times differ greatly between the senses. Since these differential delays are only partially compensated for, subjective simultaneity may result from presenting stimuli with a physical delay. Here, we are interested in whether this phenomenon reflects an amodal timing mechanism that operates across all modalities in a uniform fashion. Further, we examine the sensitivity of asynchrony detection for different modality pairs, as well as interindividual differences.
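
As an illustration only (the abstract does not specify the analysis used), the sketch below shows how simultaneity and asynchrony-detection measures are commonly derived in psychophysics: the proportion of "audio first" responses in a temporal order judgment task is fitted with a cumulative Gaussian over the stimulus onset asynchrony (SOA), yielding the point of subjective simultaneity (PSS) and a just noticeable difference (JND) as a sensitivity measure. All SOAs and response proportions here are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical audio-visual SOAs in ms (negative = auditory stimulus leads)
# and the proportion of "audio first" responses observed at each SOA.
soa = np.array([-240, -120, -60, -30, 0, 30, 60, 120, 240], dtype=float)
p_audio_first = np.array([0.97, 0.90, 0.75, 0.60, 0.45, 0.30, 0.20, 0.08, 0.02])

def psychometric(x, pss, sigma):
    """Cumulative Gaussian: probability of an 'audio first' report at SOA x."""
    return norm.cdf((pss - x) / sigma)

# Fit the psychometric function: pss is the SOA judged subjectively
# simultaneous, sigma characterises the steepness of the curve.
(pss_hat, sigma_hat), _ = curve_fit(psychometric, soa, p_audio_first, p0=[0.0, 60.0])

jnd = norm.ppf(0.75) * sigma_hat  # SOA change from the 50% to the 75% point

print(f"PSS = {pss_hat:.1f} ms, JND = {jnd:.1f} ms")
```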
In a second project, we examine the ability of the human cognitive system to adapt to asynchronous information across modalities. Adaptation may serve to reduce the disruptive effects of temporal miscalibration between signals from different modalities. We are interested in the strength of this adaptation as well as the mechanism underlying it.
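
Again purely as a hedged illustration (not the procedure reported here), temporal recalibration is often quantified as the shift of the PSS after prolonged exposure to a fixed lag between two modalities; the fraction of the adapted lag by which the PSS shifts gives a simple measure of the strength of adaptation. The numbers below are invented.

```python
def adaptation_strength(pss_baseline_ms: float, pss_adapted_ms: float,
                        adapter_lag_ms: float) -> tuple[float, float]:
    """Return the PSS shift and the proportion of the adapter lag compensated."""
    shift = pss_adapted_ms - pss_baseline_ms
    return shift, shift / adapter_lag_ms

# E.g. a baseline PSS of -10 ms that moves to +22 ms after adapting to a
# +100 ms audio-visual lag corresponds to a 32 ms shift, i.e. 32% of the lag.
print(adaptation_strength(-10.0, 22.0, 100.0))  # -> (32.0, 0.32)
```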
Future projects aim at the investigation of
- the precise relationship between the perception of synchrony and multimodal integration,
- the influence of prior knowledge about a common origin of signals on the perception of synchrony,
- the influence of timing on the perception of cause and effect,
- the neural basis of the detection of synchrony.
In conclusion, our research seeks to understand the mechanisms underlying temporal calibration between different sensory modalities, with the goal of identifying factors that foster multimodal integration and, in turn, the sense of presence.