
Released

Poster

Moving objects in ultra-rapid visual categorisation result in better accuracy, but slower reaction times than static presentations

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84291

Vuong, QC
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84258

Thornton, IM
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Locator
There are no locators available
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Kirchner, H., Vuong, Q., Thorpe, S., & Thornton, I. (2005). Moving objects in ultra-rapid visual categorisation result in better accuracy, but slower reaction times than static presentations. Poster presented at 8th Tübingen Perception Conference (TWK 2005), Tübingen, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D659-E
Abstract
Ultra-rapid categorisation studies have analysed human responses to briefly flashed, static natural scenes in order to determine the time needed to process different kinds of visual objects. Recently, Kirchner and Thorpe reported that reaction times can be extremely fast if subjects are asked to move their eyes to the side on which an animal had appeared. Accuracy was remarkably good, with the fastest reliable saccades occurring only 130 ms after stimulus onset. Using a 2AFC task with apparent-motion displays and manual responses, Vuong and colleagues further showed that humans can be detected more easily than machines. In the present study we combined the two approaches in order to determine the processing speed for static versus dynamic displays. In blocked conditions, human subjects were asked to detect either an animal or a machine, which was presented statically in half of the trials and in apparent motion in the other half. On each trial, an animal and a machine were presented simultaneously to the left and right of fixation, and subjects were asked to make a saccade or press a button on the target side. Manual responses and saccadic eye movements both resulted in good accuracy, and reaction times to animals were significantly faster than those to machines. Only saccadic eye movements showed a clear accuracy advantage for dynamic over static trials, but the analysis of mean reaction times pointed to a speed-accuracy trade-off. This might be explained by different response modes, as seen in the latency distributions. We conclude that form processing can be improved by stimulus motion, but that the speed of this process can be observed much more directly in eye-movement latencies than in manual responses.