Prof. Dr. Karol Myszkowski (Max Planck Institut Saarbrücken)

Perceptually-based global illumination, rendering, and animation techniques.

Many improvements in image synthesis and computer animation have resulted from exploiting human perceptual capacities and insensitivities. This is because the appearance of the resulting images is of primary relevance to the successful development and deployment of new algorithms for image synthesis and computer animation. In this study, we investigate applications of the perceptually-based Visible Differences Predictor (VDP) developed by Daly to guide realistic rendering computations.
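
The following is a minimal, simplified sketch of a VDP-style comparison, not Daly's full model: two luminance images go in, a per-pixel detection-probability map comes out. The CSF approximation and the psychometric-function slope used here are illustrative assumptions.

```python
import numpy as np

def csf_weight(fx, fy, peak=4.0):
    """Rough band-pass contrast sensitivity weight over spatial frequency (cpd).
    Illustrative approximation, not Daly's calibrated CSF."""
    f = np.sqrt(fx**2 + fy**2) + 1e-6
    return (f / peak) * np.exp(1.0 - f / peak)

def vdp_like_map(img_a, img_b, pixels_per_degree=30.0, slope=3.5):
    """Return a per-pixel probability-of-detection map for the differences
    between two luminance images of equal shape."""
    # Amplitude non-linearity: approximate perceived response with a cube root.
    ra, rb = np.cbrt(img_a), np.cbrt(img_b)
    # Weight the difference in the frequency domain by the CSF.
    h, w = ra.shape
    fy = np.fft.fftfreq(h, d=1.0 / pixels_per_degree)[:, None]
    fx = np.fft.fftfreq(w, d=1.0 / pixels_per_degree)[None, :]
    diff = np.fft.ifft2(np.fft.fft2(ra - rb) * csf_weight(fx, fy)).real
    # Psychometric function maps weighted difference to P(detection).
    return 1.0 - np.exp(-np.abs(diff) ** slope)
```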

First, we discuss the results of our psychophysical experiments with human observers, which were designed specifically to validate the performance of this general-purpose predictor in tasks typical of global illumination computation. Then we show an example of applying the VDP to evaluate progressive changes in image quality at various stages of the global illumination computation. The VDP responses are used to support off-line decisions about selecting, from a pool of complementary algorithms, the technique which at a given stage of the computation minimizes the perceivable differences between the intermediate and final images. Using this approach we are able to deliver high-quality images of complex environments within seconds to a few minutes, based on physically-based partial solutions.
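
The sketch below illustrates the off-line selection loop described above under simplifying assumptions: at each computation stage, the intermediate image produced by every candidate algorithm is compared against the fully converged reference, and the algorithm whose output a VDP-style metric rates as perceptually closest is recorded for that stage. The `render_intermediate` hook and the pooling threshold are hypothetical placeholders, not part of the original work.

```python
import numpy as np

def perceived_difference(img, reference):
    """Scalar summary of a VDP-style detection-probability map
    (here: the fraction of pixels with P(detection) > 0.75)."""
    p = vdp_like_map(img, reference)  # function from the previous sketch
    return np.mean(p > 0.75)

def build_schedule(candidates, reference, stages, render_intermediate):
    """For each stage (e.g. a time budget), pick the candidate algorithm
    whose intermediate image is perceptually closest to the reference."""
    schedule = []
    for stage in stages:
        scores = {name: perceived_difference(render_intermediate(name, stage),
                                             reference)
                  for name in candidates}
        schedule.append(min(scores, key=scores.get))
    return schedule
```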

Next, we discuss the extensions of the VDP required to measure the quality of animated sequences. We use the resulting Animation Quality Metric (AQM) to develop an efficient antialiasing technique that handles both still and animated images. Our antialiasing solution is based on motion-compensated filtering, and the filter parameters have been tuned using AQM predictions of animation quality as perceived by a human observer. These parameters adapt locally to the velocity of the visual pattern.
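
A schematic sketch of velocity-adaptive, motion-compensated filtering in this spirit is given below; it is not the authors' exact filter. Each pixel is averaged with its flow-predicted positions in neighbouring frames, and the temporal support grows with the local image-space velocity, where sensitivity to spatial detail is reduced. The velocity-to-radius mapping and the linear-motion extrapolation are illustrative assumptions.

```python
import numpy as np

def temporal_radius(speed_px_per_frame, base=1, gain=0.5, max_radius=4):
    """More frames are blended where the pattern moves faster."""
    return int(min(max_radius, base + gain * speed_px_per_frame))

def filter_pixel(frames, flows, t, x, y):
    """Average pixel (x, y) of frame t with its motion-compensated
    correspondences in neighbouring frames.

    frames: list of 2D luminance arrays; flows[t] maps frame t to frame t+1
    as an (H, W, 2) array of (dx, dy) displacements."""
    h, w = frames[t].shape
    dx, dy = flows[t][y, x]
    radius = temporal_radius(np.hypot(dx, dy))
    total, count = 0.0, 0
    for dt in range(-radius, radius + 1):
        if 0 <= t + dt < len(frames):
            # Predict the pixel's position dt frames away by scaling the
            # per-frame flow vector (a linear-motion assumption).
            xs = int(round(x + dt * dx))
            ys = int(round(y + dt * dy))
            if 0 <= xs < w and 0 <= ys < h:
                total += frames[t + dt][ys, xs]
                count += 1
    return total / count
```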