
Record


Released

Poster

What is an inter-sensory object? Optimal combination of vision and touch depends on their spatial coincidence

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84838

Burge,  J
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84889

Banks,  MS
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83906

Ernst,  MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary Material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Gepshtein, S., Burge, J., Banks, M., & Ernst, M. (2004). What is an inter-sensory object? Optimal combination of vision and touch depends on their spatial coincidence. Poster presented at the Fourth Annual Meeting of the Vision Sciences Society (VSS 2004), Sarasota, FL, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-D875-D
Abstract
Recent work showed that humans combine visual and haptic information about object size in a way that approaches statistical optimality: the precision of combined estimates is higher than with vision or touch alone (Ernst & Banks, 2002; Gepshtein & Banks, 2003). If the brain combines the visual and haptic signals optimally when they appear to come from the same object, the precision of combination should be greater when the signals originate from the same location in space. We examined this by varying the spatial offset between the visual and haptic stimuli. In a 2-IFC procedure, each interval contained visual and haptic stimuli, spatially superimposed or separated by up to 10 cm. The visual stimuli were random-dot stereograms of two parallel surfaces; the haptic stimuli were two parallel surfaces created by force-feedback devices. Observers indicated the interval containing the greater perceived inter-surface distance. The increase in precision with two cues as opposed to one should be greatest when the visual and haptic weights are equal, so we equated the weights for each observer by finding the surface slant at which vision and haptics were equally precise (Gepshtein & Banks, 2003). We found that inter-modality just-noticeable differences (JNDs) for object size grew as a function of the spatial separation between the visual and haptic stimuli. With no separation, JNDs were close to optimal; with large separations, JNDs worsened. We also examined whether this effect of spatial coincidence is affected by scene layout, for example, when the lack of coincidence is "explained" by occlusion of the haptic stimulus.
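The statistically optimal combination rule referenced in the abstract is the standard maximum-likelihood (reliability-weighted) model of Ernst & Banks (2002): each cue is weighted by its inverse variance, and the combined estimate has lower variance (hence a lower JND) than either cue alone. A minimal sketch of that prediction, with illustrative noise values rather than any measured parameters from this study:

```python
import math

def mle_combination(sigma_v, sigma_h):
    """Optimal (inverse-variance-weighted) combination of two noisy cues.

    sigma_v, sigma_h: standard deviations of the visual and haptic
    size estimates. Returns (w_v, w_h, sigma_combined).
    """
    var_v, var_h = sigma_v**2, sigma_h**2
    # Each cue's weight is proportional to its reliability (1/variance).
    w_v = var_h / (var_v + var_h)
    w_h = var_v / (var_v + var_h)
    # Variance of the combined estimate is always <= min(var_v, var_h).
    sigma_combined = math.sqrt(var_v * var_h / (var_v + var_h))
    return w_v, w_h, sigma_combined

# Equally precise cues (the condition the experiment equated via surface
# slant): weights are 0.5/0.5 and the combined noise drops by sqrt(2),
# the largest possible two-cue improvement.
w_v, w_h, s = mle_combination(1.0, 1.0)
print(w_v, w_h, s)  # 0.5 0.5 ~0.7071
```

Because the JND is proportional to the estimate's standard deviation, the model predicts a combined JND of roughly 1/sqrt(2) of the single-cue JND when the weights are equal; the abstract's finding is that this prediction holds with spatially coincident stimuli but breaks down as the visual and haptic stimuli are separated.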