
Released

Conference Paper

Perceptually Guided Corrective Splatting

MPS-Authors

Haber,  Jörg
Computer Graphics, MPI for Informatics, Max Planck Society;


Myszkowski,  Karol
Computer Graphics, MPI for Informatics, Max Planck Society;


Yamauchi,  Hitoshi
Computer Graphics, MPI for Informatics, Max Planck Society;


Seidel,  Hans-Peter
Computer Graphics, MPI for Informatics, Max Planck Society;

Citation

Haber, J., Myszkowski, K., Yamauchi, H., & Seidel, H.-P. (2001). Perceptually Guided Corrective Splatting. In A. Chalmers, & T.-M. Rhyne (Eds.), Proceedings of the Eurographics Conference 2001 (pp. 142-153). Oxford, UK: Blackwell.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000F-32C0-1
Abstract
One of the basic difficulties with interactive walkthroughs is the high-quality rendering of object surfaces with non-diffuse light scattering characteristics. Since full ray tracing at interactive rates is usually impossible, we render a precomputed global illumination solution using graphics hardware and use the remaining computational power to correct the appearance of non-diffuse objects on the fly. The question arises of how to obtain the best image quality, as perceived by a human observer, within the limited amount of time available for each frame. We address this problem by enforcing corrective computation for those non-diffuse objects that are selected using a computational model of visual attention. We consider both saliency- and task-driven selection of these objects and benefit from the fact that shading artifacts on "unattended" objects are likely to remain unnoticed. We use a hierarchical image-space sampling scheme to control ray tracing and splat the generated point samples. The resulting image converges progressively to the ray-traced solution if the viewing parameters remain unchanged. Moreover, we use a sample cache to enhance visual appearance whenever the time budget for correction has been too low for a frame. We check the validity of the cached samples using a novel criterion suited for non-diffuse surfaces and reproject valid samples into the current view.
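
To make the per-frame procedure described in the abstract concrete, the following Python sketch outlines one corrective pass: the precomputed global illumination image is assumed to be in the frame buffer already, a per-frame ray budget is distributed over non-diffuse objects according to a combined saliency/task attention weight, samples are generated with a coarse-to-fine image-space scheme, traced, splatted, and stored in a sample cache for later reprojection. Everything here (SceneObject, trace_corrective_ray, the 50/50 attention blend, the splat footprint) is a hypothetical stand-in for illustration, not the paper's implementation; the cache-validity test and the reprojection step are omitted.

# A minimal, illustrative sketch of one per-frame corrective pass, loosely
# following the abstract above. All names and numbers are assumptions for
# illustration, not the authors' implementation or API.

import numpy as np
from dataclasses import dataclass

WIDTH, HEIGHT = 320, 240


@dataclass
class SceneObject:
    name: str
    saliency: float        # output of a visual-attention (saliency) model
    task_relevance: float  # task-driven importance, 0 if not task-relevant
    bbox: tuple            # screen-space bounding box (x0, y0, x1, y1)


def attention_weight(obj: SceneObject) -> float:
    # Combine saliency- and task-driven terms; the 50/50 blend is a guess.
    return 0.5 * obj.saliency + 0.5 * obj.task_relevance


def hierarchical_samples(bbox, level):
    # Coarse-to-fine image-space sampling: level 0 is a sparse grid inside the
    # object's bounding box, each further level doubles the grid resolution.
    x0, y0, x1, y1 = bbox
    n = 4 * (2 ** level)
    for y in np.linspace(y0, y1, n, endpoint=False):
        for x in np.linspace(x0, x1, n, endpoint=False):
            yield int(x), int(y)


def trace_corrective_ray(x, y):
    # Stand-in for the expensive ray-traced shading of a non-diffuse surface.
    return np.array([0.8, 0.7, 0.6]) * (0.5 + 0.5 * np.sin(0.1 * x) * np.cos(0.1 * y))


def splat(image, x, y, color, radius=2):
    # Write the point sample into a small square footprint around (x, y).
    x0, x1 = max(0, x - radius), min(WIDTH, x + radius + 1)
    y0, y1 = max(0, y - radius), min(HEIGHT, y + radius + 1)
    image[y0:y1, x0:x1] = color


def corrective_pass(image, objects, ray_budget, sample_cache):
    # Distribute the per-frame ray budget over non-diffuse objects according to
    # their attention weight, refine hierarchically, and cache each sample so a
    # later frame can revalidate and reproject it.
    weights = [attention_weight(o) for o in objects]
    total = sum(weights) or 1.0
    for obj, w in zip(objects, weights):
        budget = int(ray_budget * w / total)
        traced, level = 0, 0
        while traced < budget:
            for x, y in hierarchical_samples(obj.bbox, level):
                if traced >= budget:
                    break
                color = trace_corrective_ray(x, y)
                splat(image, x, y, color)
                sample_cache.append((x, y, color))  # candidates for reprojection
                traced += 1
            level += 1
    return image


if __name__ == "__main__":
    # The zero image stands in for the hardware-rendered global illumination pass.
    frame = np.zeros((HEIGHT, WIDTH, 3))
    scene = [
        SceneObject("mirror", saliency=0.9, task_relevance=1.0, bbox=(40, 40, 160, 200)),
        SceneObject("glass vase", saliency=0.4, task_relevance=0.0, bbox=(200, 80, 300, 220)),
    ]
    cache = []
    corrective_pass(frame, scene, ray_budget=2000, sample_cache=cache)
    print(len(cache), "corrective samples splatted this frame")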