
Record


Released

Conference Paper

Integrating Visual and Haptic Shape Information to Form a Multimodal Perceptual Space

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83925

Gaissert,  N
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84298

Wallraven,  C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary Material (publicly accessible)
There are no publicly accessible supplementary materials available
Citation

Gaissert, N., & Wallraven, C. (2011). Integrating Visual and Haptic Shape Information to Form a Multimodal Perceptual Space. In IEEE World Haptics Conference (WHC 2011) (pp. 451-456). Piscataway, NJ, USA: IEEE.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-BB76-4
Abstract
In this study, we address the question of how much the visual and haptic modalities contribute to the final formation of a complex multisensory perceptual space. By varying three shape parameters, a physical shape space of shell-like objects was generated. Participants were allowed to either see or touch the objects, or to use both senses to explore them. Similarity ratings were performed and analyzed using multidimensional scaling (MDS) techniques. By comparing the unimodal perceptual spaces to the multimodal perceptual space, we tried to resolve the impact of the visual and haptic modalities on the combined percept. We found that neither the visual nor the haptic modality dominated the final percept; rather, the two modalities contributed to the combined percept almost equally. To investigate to what degree these results are transferable to natural objects, we performed the same visual, haptic, and visuo-haptic similarity ratings and multidimensional scaling analyses using a set of natural sea shells. Again, we found almost equal contributions of the visual and the haptic modalities to the combined percept. Our results suggest that multisensory perceptual spaces are based on a complex combination of object information gathered by different senses.
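
The analysis described in the abstract can be illustrated with a minimal sketch (not taken from the paper): pairwise similarity ratings are converted to dissimilarities and embedded with MDS to obtain a perceptual space whose dimensions can then be compared across modalities. The synthetic ratings matrix, the 1-7 rating scale, and the choice of a three-dimensional embedding below are assumptions for illustration only.

import numpy as np
from sklearn.manifold import MDS

# Hypothetical averaged pairwise similarity ratings for n objects (symmetric, 1-7 scale);
# in the actual study these would come from participants' unimodal or multimodal ratings.
rng = np.random.default_rng(0)
n = 21
raw = rng.uniform(1.0, 7.0, size=(n, n))
similarity = (raw + raw.T) / 2.0          # symmetrize the rating matrix
np.fill_diagonal(similarity, 7.0)         # each object is maximally similar to itself

# MDS expects dissimilarities, so invert the similarity scale.
dissimilarity = similarity.max() - similarity

# Embed the objects into a low-dimensional perceptual space (3 dimensions assumed here).
mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)  # shape (n, 3): one coordinate vector per object
print(coords.shape)

Perceptual spaces recovered this way for the visual, haptic, and visuo-haptic conditions could then be compared, for example by correlating inter-object distances across conditions.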