
Released

Poster

Surface-slant judgment from texture and motion

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84925

Rosas, P.
Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Rosas, P., Zaenen, P., & Wagemans, J. (2003). Surface-slant judgment from texture and motion. Poster presented at 26th European Conference on Visual Perception, Paris, France.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-DBE2-0
Abstract
We have previously observed systematic differences in slant perception, measured by probe adjustment, for different types of synthetic texture (Rosas et al, 2002 Perception 31 Supplement, 27). These results yielded a rank order of textures according to the correlation between depicted and perceived slant when a given texture was mapped onto the surface. Textures composed of circles tended to allow the best slant judgments, followed by a leopard-skin-like pattern, then by a 'coherent' noise, and finally by a fractal noise, which induced the worst performance. Here we compare those results with an experiment in which subjects judged the slant of textured flat planes undergoing two types of motion: translation in the vertical direction (parallel to the viewing direction) and in the horizontal direction (orthogonal to the viewing direction). Our results show that performance was in most cases better for moving planes than for static planes, for all texture types and both types of motion. Perceived slant of the moving planes was comparable for both types of motion. A simple summation of cue information would predict a similar enhancement of performance for all texture types, which was not observed. A cue-combination scheme in which the influence of a particular cue is related to its reliability, such as the modified weak fusion model (Landy et al, 1995 Vision Research 35 389-412), would predict different influences of the texture types on the combined percept, following the rank order mentioned above. We did not observe such a gradual enhancement of performance, though the largest enhancement occurred for the least reliable texture.