
Released

Book Chapter

Multisensory contributions to spatial perception

MPS-Authors

Mohler,  BJ
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Di Luca,  M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Mohler, B., Di Luca, M., & Bülthoff, H. (2012). Multisensory contributions to spatial perception. In D. Waller, & L. Nadel (Eds.), Handbook of Spatial Cognition (pp. 81-97). Washington, DC, USA: American Psychological Association.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-B5CC-5
Abstract
How do we know where environmental objects are located with respect to our body? How are we able to navigate, manipulate, and interact with the environment? In this chapter, we describe how such goals are achieved by capturing sensory signals from the environment and performing internal computations. The first step, called early or low-level processing, is based on the functioning of feature detectors that respond selectively to elementary patterns of stimulation. Separate organs capture sensory signals and then process them separately in what we normally refer to as senses: smell, taste, touch, audition, and vision. In the first section of this chapter, we present the sense modalities that provide sensory information for the perception of spatial properties such as distance, direction, and extent. Although it is hard to distinguish where early processing ends and high-level perception begins, the rest of the chapter focuses on the intermediate level of processing, which is implicitly assumed to be a key component of several perceptual and computational theories (Gibson, 1979; Marr, 1982) and which, for the visual modality, has been termed mid-level vision (see Nakayama, He, & Shimojo, 1995). In particular, we discuss the ability of the perceptual system to specify the position and orientation of environmental objects relative to other objects and especially relative to the observer’s body. We present computational theories and relevant scientific results on individual sense modalities and on the integration of sensory information within and across the sensory modalities. Finally, in the last section of this chapter, we describe how the information-processing approach has enabled a better understanding of perceptual processes in relation to two specific high-level perceptual functions: self-orientation perception and object recognition.