
Released

Journal Article

Multimodal Similarity and Categorization of Novel, Three-Dimensional Objects

MPS-Authors
Cooke, T
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Jäkel, F
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Wallraven, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Cooke, T., Jäkel, F., Wallraven, C., & Bülthoff, H. H. (2007). Multimodal Similarity and Categorization of Novel, Three-Dimensional Objects. Neuropsychologia, 45(3), 484-495. doi:10.1016/j.neuropsychologia.2006.02.009.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CEA9-A
Abstract
Similarity has been proposed as a fundamental principle underlying mental object
representations and capable of supporting cognitive-level tasks such as categorization.
However, much of the research has considered connections between similarity and
categorization for tasks performed using a single perceptual modality. Considering
similarity and categorization within a multimodal context opens up a number of important
questions: Are the similarities between objects the same when they are perceived using
different modalities or using more than one modality at a time? Is similarity still able to
explain categorization performance when objects are experienced multimodally? In this
study, we addressed these questions by having subjects explore novel, 3D objects which
varied parametrically in shape and texture using vision alone, touch alone, or touch and
vision together. Subjects then performed a pair-wise similarity rating task and a free
sorting categorization task. Multidimensional scaling (MDS) analysis of similarity data
revealed that a single underlying perceptual map whose dimensions corresponded to shape
and texture could explain visual, haptic, and bimodal similarity ratings. However, the
relative dimension weights varied according to modality: shape dominated texture when
objects were seen, whereas shape and texture were roughly equally important in the
haptic and bimodal conditions. Some evidence was found for a multimodal connection
between similarity and categorization: the probability of category membership increased
with similarity while the probability of a category boundary being placed between two
stimuli decreased with similarity. In addition, dimension weights varied according to
modality in the same way for both tasks. The study also demonstrates the usefulness of
3D printing technology and MDS techniques in the study of visuohaptic object processing.
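The kind of MDS analysis described in the abstract — recovering a low-dimensional perceptual map from pairwise (dis)similarity judgments — can be sketched generically. The example below is a minimal illustration of classical (Torgerson) MDS with invented stimulus coordinates on a hypothetical shape × texture grid; it is not the authors' actual analysis, data, or weighting model:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed an n x n dissimilarity matrix D into k dimensions
    via classical (Torgerson) MDS: double-center the squared
    dissimilarities, then take the top-k eigenvectors."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the k largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Four invented stimuli varying along two parametric dimensions
# (here labeled "shape" and "texture" purely for illustration).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # pairwise dissimilarities

Y = classical_mds(D, k=2)  # recovered 2-D perceptual map

# For exact Euclidean dissimilarities, the recovered configuration
# reproduces the original inter-point distances (up to rotation/reflection).
D_hat = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
print(np.allclose(D, D_hat))  # True
```

Differential weighting of the dimensions across modalities, as reported in the study, would correspond to a weighted-MDS (INDSCAL-style) model rather than this unweighted sketch.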