  SceneGen: Automated 3D Scene Generation for Psychophysical Experiments

Franz, G., von der Heyde, M., & Bülthoff, H. (2003). SceneGen: Automated 3D Scene Generation for Psychophysical Experiments. Poster presented at 6. Tübinger Wahrnehmungskonferenz (TWK 2003), Tübingen, Germany.

Creators:
Franz, G.1, Author
von der Heyde, M.1, Author
Bülthoff, H.H.1, Author
Bülthoff, H.H., Editor
Gegenfurtner, K.R., Editor
Mallot, H.A., Editor
Ulrich, R., Editor
Wichmann, F.A., Editor
Affiliations:
1 Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797

Content

Free keywords: -
 Abstract: For a systematic investigation of the perception of real spaces, photographs offer a chance to combine pictorial realism with laboratory experimental conditions. Psychophysical methods, however, often need a large variety of fully controlled stimuli, which is difficult to achieve with photographs of real scenes. Virtual scenes, on the other hand, provide the necessary flexibility, but their generation by hand is usually too labor-intensive for larger quantities. Our SceneGen toolbox is capable of integrating the advantages of both in a fully automated process. SceneGen combines the good pictorial quality of photo textures, a physics-based radiosity lighting simulation (POV-Ray renderer), and the complete and convenient control of a high-level, feature-oriented, XML-based description language. Thus, all scene features and rendering parameters are independently adjustable. External objects or scene parts can be integrated via a VRML interface. All this allows for the automated generation of an unlimited number of 3D multi-textured, realtime-capable OpenGL models or panoramic images with exactly defined differences. The applicability of the scenes as psychophysical stimuli is demonstrated by our current work on the influence of view parameters on distance estimates and semantic differential ratings in virtual reality. Nine subjects in two groups rated two sets of 20 precomputed rectangular interiors. The rooms differed in dimensions, proportions, and the number and form of openings in similar ranges to real rooms, but had identical surface properties and illumination. The results show a significant effect of the main experimental parameter, eyepoint height, on perceived egocentric distances as well as on allocentric distances perpendicular to gaze direction. Surprisingly, allocentric distance estimates parallel to gaze direction are not significantly influenced. This suggests that the participants' horizontal self-location is affected by the simulated eyepoint height. Our experimental paradigm allowed us to investigate spatial perception depending solely on pictorial cues under fully controlled but diverse and comparatively natural conditions. SceneGen is expected to be especially useful for the field of empirical research touching the disciplines of architecture, virtual reality, and perceptual psychophysics.
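
The abstract does not give SceneGen's actual XML schema or API. As a rough illustration of the pipeline it describes (a feature-oriented XML scene description expanded into a POV-Ray scene and rendered in batch with systematically varied parameters), here is a minimal Python sketch. The schema (scene, room, eyepoint elements) and the XML-to-scene translation are invented for illustration; only the povray command-line invocation (+I input, +O output, +W/+H image size) is real POV-Ray usage.

# Hypothetical sketch of a SceneGen-style pipeline: a feature-oriented XML
# description of a rectangular interior is expanded into a POV-Ray scene
# file and rendered in batch. Element and attribute names are invented;
# the actual SceneGen schema is not published in this abstract.
import subprocess
import xml.etree.ElementTree as ET

def make_description(width, depth, height, eye_height):
    """Build a minimal XML scene description (hypothetical schema)."""
    scene = ET.Element("scene")
    ET.SubElement(scene, "room", width=str(width), depth=str(depth),
                  height=str(height))
    ET.SubElement(scene, "eyepoint", height=str(eye_height))
    return scene

def to_pov(scene):
    """Translate the XML description into POV-Ray scene language."""
    room = scene.find("room")
    w, d, h = (float(room.get(k)) for k in ("width", "depth", "height"))
    ey = float(scene.find("eyepoint").get("height"))
    return f"""\
camera {{ location <{w/2}, {ey}, 0.1> look_at <{w/2}, {ey}, {d}> }}
light_source {{ <{w/2}, {h - 0.1}, {d/2}> color rgb <1, 1, 1> }}
// a plain box standing in for the textured room geometry
box {{ <0, 0, 0>, <{w}, {h}, {d}>
      hollow pigment {{ color rgb <0.9, 0.9, 0.9> }} }}
"""

# Precompute a set of stimuli that differ only in eyepoint height,
# as in the experiment described above.
for i, eye_height in enumerate((1.2, 1.4, 1.6, 1.8)):
    desc = make_description(width=4.0, depth=6.0, height=2.6,
                            eye_height=eye_height)
    with open(f"room_{i}.pov", "w") as f:
        f.write(to_pov(desc))
    subprocess.run(["povray", f"+Iroom_{i}.pov", f"+Oroom_{i}.png",
                    "+W640", "+H480"], check=True)

In the real toolbox, per-surface photo textures and radiosity parameters would presumably also be exposed as independent XML attributes; that independence is what makes batch variation of a single parameter such as eyepoint height possible while holding surface properties and illumination identical.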

Details

Language(s): -
 Dates: 2003-02
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: URI: http://www.twk.tuebingen.mpg.de/twk03/
BibTeX Citekey: 2089
 Degree: -

Event

Title: 6. Tübinger Wahrnehmungskonferenz (TWK 2003)
Place of Event: Tübingen, Germany
Start-/End Date: -
