Neural James-Stein Combiner for Unbiased and Biased Renderings


Authors

Gu, Jeongmin
Iglesias-Guitian, Jose A.
Moon, Bochang


Bibliographic citation

Jeongmin Gu, Jose A. Iglesias-Guitian, and Bochang Moon. 2022. Neural James-Stein Combiner for Unbiased and Biased Renderings. ACM Trans. Graph. 41, 6, Article 262 (December 2022), 14 pages. https://doi.org/10.1145/3550454.3555496

Abstract

Unbiased rendering algorithms such as path tracing produce accurate images given a huge number of samples, but in practice these techniques often leave visually distracting artifacts (i.e., noise) in their rendered images due to a limited time budget. A favored approach to mitigating the noise is to apply learning-based denoisers to unbiased but noisy rendered images, suppressing the noise while preserving image details. However, such denoising techniques typically introduce a systematic error, i.e., the denoising bias, which, unlike the other type of error, variance, does not decline rapidly as the sample count increases. This can lead to slow numerical convergence of the denoising techniques. We propose a new combination framework built upon the James-Stein (JS) estimator, which merges a pair of unbiased and biased rendered images, e.g., a path-traced image and its denoised result. Unlike existing post-correction techniques for image denoising, our framework helps an input denoiser achieve lower errors than its unbiased input without relying on accurate estimation of per-pixel denoising errors. We demonstrate that our framework, based on the well-established JS theories, improves the error-reduction rates of state-of-the-art learning-based denoisers more robustly than recent post-denoisers.
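For intuition about the combination the abstract describes, here is a minimal sketch of the classical (positive-part) James-Stein estimator, which shrinks an unbiased estimate toward a biased target. This is only the textbook formula on which the paper's framework builds, not the paper's neural variant; the function name, the flattened-vector treatment of a pixel block, and the scalar `variance` argument are assumptions for illustration.

```python
import numpy as np

def james_stein_combine(unbiased, biased, variance):
    """Positive-part James-Stein shrinkage of an unbiased estimate
    toward a biased target (illustrative textbook form only; the
    paper replaces hand-derived quantities with a learned combiner)."""
    x = np.asarray(unbiased, dtype=float).ravel()   # e.g., path-traced pixels
    mu = np.asarray(biased, dtype=float).ravel()    # e.g., denoised pixels
    p = x.size                                      # JS dominates the MLE when p >= 3
    diff = x - mu
    sq_norm = float(np.dot(diff, diff))
    if sq_norm == 0.0:
        return mu.copy()                            # estimates agree; nothing to shrink
    # Shrinkage factor, clipped at 0 (the "positive-part" modification).
    shrink = max(0.0, 1.0 - (p - 2) * variance / sq_norm)
    return mu + shrink * diff
```

With zero variance the result equals the unbiased input (no shrinkage); as the variance grows, the result moves toward the biased (denoised) input.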

Description

This is the author's version of the work. It is posted here for your personal use, not for redistribution. The definitive Version of Record was published at https://doi.org/10.1145/3550454.3555496.

Rights

© 2022 Copyright held by the owner/author(s). Publication rights licensed to ACM.