

Poster

What is an inter-sensory object? Optimal combination of vision and touch depends on their spatial coincidence

MPS-Authors

Ernst,  MO
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Gepshtein, S., Burge, J., Banks, M., & Ernst, M. (2004). What is an inter-sensory object? Optimal combination of vision and touch depends on their spatial coincidence. Poster presented at Fourth Annual Meeting of the Vision Sciences Society (VSS 2004), Sarasota, FL, USA.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D875-D
Abstract
Recent work showed that humans combine visual and haptic information about object size in a way that approaches statistical optimality: the precision of the combined estimate is higher than with vision or touch alone (Ernst & Banks, 2002; Gepshtein & Banks, 2003). If the brain combines visual and haptic signals optimally only when they appear to come from the same object, then the precision of the combination should be greater when the signals originate from the same location in space. We examined this by varying the spatial offset between the visual and haptic stimuli. In a 2-IFC procedure, each interval contained visual and haptic stimuli, spatially superimposed or separated by up to 10 cm. The visual stimuli were random-dot stereograms of two parallel surfaces; the haptic stimuli were two parallel surfaces created by force-feedback devices. Observers indicated the interval containing the greater perceived inter-surface distance. The increase in precision with two cues as opposed to one should be greatest when the visual and haptic weights are equal, so we equated the weights for each observer by finding the surface slant at which vision and haptics were equally precise (Gepshtein & Banks, 2003). We found that inter-modality just-noticeable differences (JNDs) for object size grew as a function of the spatial separation between the visual and haptic stimuli. With no separation, JNDs were close to optimal; with large separations, JNDs worsened. We also examined whether this effect of spatial coincidence is affected by scene layout, for example, when the lack of coincidence is “explained” by occlusion of the haptic stimulus.
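
The “statistical optimality” invoked above is the standard maximum-likelihood cue-combination model cited in the abstract (Ernst & Banks, 2002). As a minimal sketch, assuming independent Gaussian noise on the visual and haptic size estimates (symbols here are illustrative, not the authors' notation), the model predicts:

% Combined estimate: reliability-weighted average of the visual and haptic estimates
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V

% Variance of the combined estimate: never worse than the better single cue
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min(\sigma_V^2, \sigma_H^2)

% With equal single-cue reliabilities (\sigma_V = \sigma_H = \sigma), the predicted improvement is maximal
\sigma_{VH} = \sigma / \sqrt{2}

Since the JND is proportional to the standard deviation of the underlying estimate, the predicted two-cue benefit is largest when the visual and haptic weights are equal, which is why the weights were equated per observer (by choosing the surface slant at which vision and haptics were equally precise) before the inter-modality JNDs were measured.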