Multisensory self-motion estimation

Smith, S., Butler, J., & Bülthoff, H. (2006). Multisensory self-motion estimation. Talk presented at 36th Annual Meeting of the Society for Neuroscience (Neuroscience 2006). Atlanta, GA, USA.

Creators:
Smith, ST (1), Author
Butler, JS (2), Author
Bülthoff, HH (2), Author
Affiliations:
(1) Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
(2) Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797

Content

Free keywords: -
Abstract: Navigation through the environment is a naturally multisensory task involving a coordinated set of sensorimotor processes that encode and compare information from visual, vestibular, proprioceptive, motor-corollary, and cognitive inputs. The extent to which visual information dominates this process is no better demonstrated than by the compelling illusion of self-motion generated in a stationary participant by a large-field visual motion stimulus. The importance of visual input for estimating the direction of self-motion (heading) was first recognised by Gibson (1950), who postulated that heading could be recovered by locating the focus of expansion (FOE) of the radially expanding optic flow field coincident with forward translation. A number of behavioural studies have subsequently shown that humans can estimate their heading to within a few degrees using optic flow and other visual cues. For simple linear translation without eye or head rotations, Warren and Hannon (1988) reported accurate discrimination of visual heading direction of about 1.5°. Despite the importance of visual information in such tasks, self-motion also stimulates the vestibular end-organs, which provide information about the angular and linear accelerations of the head. Our previous research (Smith et al., 2004) has shown that humans with intact vestibular function can estimate their direction of linear translation using vestibular cues alone with as much certainty as they do using visual cues. Here we report the results of an ongoing investigation of self-motion estimation showing that visual and vestibular information can be combined in a statistically optimal fashion. We discuss our results from the perspective that successful execution of self-motion behaviour requires the computation of one's own spatial orientation relative to the environment.
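The "statistically optimal" combination mentioned in the abstract is conventionally modelled as maximum-likelihood integration: each cue's estimate is weighted by its inverse variance, so the combined estimate is never less reliable than the better single cue. The abstract does not specify the authors' model or their numbers, so the sketch below, with made-up heading values, illustrates the standard scheme rather than their analysis.

```python
import math

def combine_cues(mu_vis, sigma_vis, mu_vest, sigma_vest):
    """Maximum-likelihood (inverse-variance weighted) combination of two
    independent Gaussian heading estimates.

    Returns the combined mean and its standard deviation."""
    w_vis = 1.0 / sigma_vis ** 2
    w_vest = 1.0 / sigma_vest ** 2
    mu = (w_vis * mu_vis + w_vest * mu_vest) / (w_vis + w_vest)
    sigma = math.sqrt(1.0 / (w_vis + w_vest))
    return mu, sigma

# Hypothetical numbers: a 10 deg visual heading estimate and a 14 deg
# vestibular estimate, each with sigma = 2 deg (equally reliable cues,
# consistent with the unimodal equivalence the abstract describes).
mu, sigma = combine_cues(10.0, 2.0, 14.0, 2.0)
print(round(mu, 2), round(sigma, 2))  # 12.0 1.41
```

With equal reliabilities the cues are weighted equally, and the combined standard deviation drops by a factor of √2 relative to either cue alone, which is the signature prediction tested in optimal-integration studies.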

Details

Language(s): -
 Dates: 2006-10
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: URI: http://www.sfn.org/index.aspx?pagename=abstracts_ampublications
BibTex Citekey: 4506
 Degree: -

Event

Title: 36th Annual Meeting of the Society for Neuroscience (Neuroscience 2006)
Place of Event: Atlanta, GA, USA
Start-/End Date: -
