Released

Journal Article

Combination and Integration in the Perception of Visual-Haptic Compliance Information

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83885

Di Luca, M
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84748

Klatzky, RL
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Locator
There are no locators available
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Kuschel, M., Di Luca, M., Buss, M., & Klatzky, R. (2010). Combination and Integration in the Perception of Visual-Haptic Compliance Information. IEEE Transactions On Haptics, 3(4), 234-244. doi:10.1109/TOH.2010.9.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-BDB6-2
Abstract
The compliance of a material can be conveyed through mechanical interactions in a virtual environment and perceived through both visual and haptic cues. We investigated this basic aspect of perception. In two experiments subjects performed compliance discriminations, and the mean perceptual estimate (PSE) and the perceptual standard deviation (proportional to JND) were derived from psychophysical functions. Experiment 1 supported a model in which each modality acted independently to produce a compliance estimate, and the two estimates were then integrated to produce an overall value. Experiment 2 tested three mathematical models of the integration process. The data ruled out exclusive reliance on the more reliable modality and stochastic selection of one modality. Instead, the results supported an integration process that constitutes a weighted summation of two random variables defined by the single-modality estimates. The model subsumes optimal fusion but also provided valid predictions when the weights were not optimal. Weights were optimal (i.e., minimized variance) when visual and haptic inputs were congruent, but not when they were incongruent.
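
For reference, the minimum-variance ("optimal fusion") special case of such a weighted summation is conventionally written as below. This is the standard cue-combination formulation given as an illustrative sketch, not the paper's own notation: \hat{c}_V and \hat{c}_H stand for the single-modality compliance estimates and \sigma_V^2, \sigma_H^2 for their variances.

% Weighted summation of the visual and haptic estimates (illustrative symbols)
\[
  \hat{c}_{VH} = w_V\,\hat{c}_V + w_H\,\hat{c}_H, \qquad w_V + w_H = 1
\]
% Variance of the combined estimate is minimized when each weight is
% proportional to the reliability (inverse variance) of its modality:
\[
  w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2}, \qquad
  w_H = \frac{\sigma_V^2}{\sigma_V^2 + \sigma_H^2}, \qquad
  \sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min(\sigma_V^2,\, \sigma_H^2)
\]

With these weights the combined variance is never larger than that of either single modality, which is the sense in which the congruent-cue weights reported in the abstract are optimal; with non-optimal weights the same weighted-summation form still applies, only with a larger combined variance.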