Record


Released

Poster

How does the brain identify living things based on their motion?

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84201

Schultz, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Schultz, J., & Bülthoff, H. (2009). How does the brain identify living things based on their motion? Poster presented at 39th Annual Meeting of the Society for Neuroscience (Neuroscience 2009), Chicago, IL, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-C2B0-9
Abstract
Animals (including humans) have to identify living, moving things in their environment: these could be prey, enemies, or mates, and interactions with them must be actively controlled. Living things can be detected visually through their shape, their motion, or both. When shape is hard to see (fog, twilight, great distance, a small animal), motion becomes an important cue. Biological motion has been studied widely using point-light displays, but these displays appear to contain some form information that influences recognition. To study the neural correlates of the detection of living entities from motion alone, we developed a stimulus consisting of a single moving dot, thus eliminating all possible sources of information about the form, spatial arrangement, shape, or structure of the object. Our single dot moved such that it appeared either self-propelled (modelled on the movements of a fly) or moved by an external force (modelled on a leaf drifting in the wind). Both types of movement were built using the same equation but differed in speed and acceleration profiles according to a small set of parameters. Low-level characteristics of the stimuli (range of positions on the screen, average speed, overall aspect of the trajectory) were kept as constant as possible. The parameters could be varied in a continuous fashion to create morphs between the self-propelled and externally-moved extremes. Consistent with expectations, behavioral experiments showed that self-propelled stimuli were perceived as more animate (i.e., more likely to be alive) than the externally-moved stimuli, with a gradual transition occurring in the intermediate morphs. The extreme stimuli and four intermediate morphs were presented in an fMRI experiment to participants who had to categorize the stimuli as alive or non-alive.
Using separate functional localizers, we localized area hMT+/V5 and the region of the superior temporal sulcus responding to point-light walkers, and found that neither region showed changes in BOLD response that followed the changes in percept. However, BOLD response in a region of the left posterior superior parietal cortex scaled with the degree of perceived animacy. This suggests that the STS is not simply a detector of all kinds of animate motion, but might only be implicated when some sort of shape information in the stimuli (as with point-light displays or with interacting dots) contributes to the percept of animacy.
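The abstract describes morphing a single-dot trajectory between a self-propelled ("fly-like") and an externally-moved ("leaf-like") extreme by continuously interpolating a small set of speed and turning parameters. The actual equation and parameter values are not given in the abstract; the following is a minimal sketch under that assumption, with all names and numbers hypothetical, not taken from the study:

```python
import math
import random

# Hypothetical parameter sets: higher speed/turn variability for the
# self-propelled ("fly") extreme, smoother motion for the externally
# moved ("leaf") extreme. Values are illustrative only.
LEAF = {"mean_speed": 2.0, "speed_noise": 0.02, "turn_noise": 0.05}
FLY = {"mean_speed": 2.0, "speed_noise": 0.30, "turn_noise": 0.60}

def morph_params(alpha):
    """Linearly interpolate parameters: alpha=0 -> leaf-like, alpha=1 -> fly-like."""
    return {k: (1 - alpha) * LEAF[k] + alpha * FLY[k] for k in LEAF}

def trajectory(alpha, n_steps=500, seed=0):
    """Generate a single-dot 2D path whose speed and turning statistics
    follow the morphed parameter set; mean speed is held constant across
    morph levels, mimicking the matched low-level characteristics."""
    rng = random.Random(seed)
    p = morph_params(alpha)
    x = y = heading = 0.0
    speed = p["mean_speed"]
    path = [(x, y)]
    for _ in range(n_steps - 1):
        heading += rng.gauss(0.0, p["turn_noise"])                   # direction jitter
        speed = max(0.1, speed + rng.gauss(0.0, p["speed_noise"]))   # speed jitter
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
        path.append((x, y))
    return path
```

Intermediate morph levels (e.g. `alpha` in steps of 0.2) would then yield the gradual transition in perceived animacy reported in the behavioral experiments.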