
Released

Poster

Exploring connections between similarity and categorization using vision and touch

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83865

Cooke, T
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83992

Jäkel, F
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84298

Wallraven, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Cooke, T., Jäkel, F., Wallraven, C., & Bülthoff, H. (2006). Exploring connections between similarity and categorization using vision and touch. Poster presented at 9th Tübingen Perception Conference (TWK 2006), Tübingen, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D297-4
Abstract
Similarity has been proposed as a fundamental principle underlying the formation of category structures; however, much of the research on perceptual similarity and categorization has focused on a single modality, usually vision. Can the notion of similarity still provide a basis for explaining categorization when objects are perceived with one or more modalities? We addressed this question by having subjects see, touch, or both see and touch novel 3D objects that varied parametrically in shape and texture. They then performed a pairwise similarity-rating task and a free-sorting categorization task. Using multidimensional scaling (MDS), we found that a single underlying perceptual map, whose dimensions corresponded to shape and texture, could explain visual, haptic, and bimodal similarity ratings. However, the relative weights of the map's dimensions varied with modality: shape dominated texture when objects were seen, whereas shape and texture were equally important in the haptic and bimodal conditions. We found some evidence for a connection between similarity and categorization in a multimodal context: the probability of shared category membership increased with similarity, while the probability of a category boundary being placed between two stimuli decreased with similarity. Moreover, the relative weight accorded to shape and texture varied in the same way for both tasks when modality was changed. The study also demonstrates how 3D printing technology and MDS techniques can be fruitfully applied to the study of visuohaptic object processing.
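The MDS analysis described in the abstract can be illustrated with a minimal sketch of classical (Torgerson) MDS, which recovers a low-dimensional perceptual map from a matrix of pairwise dissimilarities. This is not the study's analysis code; the 2x2 stimulus grid below is a hypothetical stand-in for objects varying along two parameters (e.g. shape and texture):

```python
import numpy as np

def classical_mds(D, n_dims=2):
    """Classical (Torgerson) MDS: embed points so that Euclidean
    distances in the embedding approximate the dissimilarities in D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:n_dims]  # largest eigenvalues first
    scale = np.sqrt(np.maximum(eigvals[idx], 0))
    return eigvecs[:, idx] * scale            # coordinates, shape (n, n_dims)

# Hypothetical example: four objects on a 2x2 grid of two parameters.
coords = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
D = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
X = classical_mds(D, n_dims=2)
# Inter-point distances in X reproduce D (up to rotation/reflection).
```

Differential weighting of the map's dimensions across modalities, as reported in the abstract, would correspond to stretching or shrinking the recovered axes before computing distances.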