Foveated Denoising for Ray Tracing Rendering
Saved in:
Main Authors: , ,
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10804172/
Summary: Foveated graphics allocate computational load in light of the non-uniform nature of the human visual system (HVS), accelerating scene rendering for virtual reality (VR). Although the asymmetric spatial sensitivity of the HVS, such as visual acuity, is widely recognized and exploited to improve the rendering performance of VR, the variation of temporal contrast sensitivity across the field of view (FOV) is less explored. In this paper, we quantify the non-uniform response of the HVS to flickering noise across the FOV with an SSIM-based model, in which the sensitivity of each visual field is proportional to the tolerated variation of local SSIM per frame. Our pilot experiment reveals that denoising requirements vary significantly across the FOV, with central vision demanding a more stable SSIM variation. Based on these findings, a foveated denoising method is proposed: the central vision within 18.5° is rendered with deep learning (DL) based denoising, and the periphery is rendered with temporal anti-aliasing (TAA). A user study is conducted with VR scenes rendered using ray tracing. The results demonstrate that the foveated denoising method provides perceptually comparable image quality to global DL-based denoising while saving at least 40.22% of the computational cost in VR.
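The summary's core mechanism — partitioning the frame at 18.5° of eccentricity so that only the central region receives the expensive DL denoiser while the periphery falls back to TAA — can be sketched as a per-pixel mask. This is a minimal illustration, not the paper's implementation: the helper names (`eccentricity_map`, `foveation_mask`), the small-angle degrees-per-pixel mapping, and the gaze/resolution parameters are all assumptions for the sake of the example; only the 18.5° threshold comes from the article.

```python
import numpy as np

def eccentricity_map(width, height, gaze_px, deg_per_px):
    """Angular eccentricity (degrees) of each pixel from the gaze point.

    Hypothetical helper using a small-angle approximation: pixel offsets
    are scaled linearly by deg_per_px rather than projected through the
    HMD optics.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    dx = (xs - gaze_px[0]) * deg_per_px
    dy = (ys - gaze_px[1]) * deg_per_px
    return np.hypot(dx, dy)

def foveation_mask(width, height, gaze_px, deg_per_px, threshold_deg=18.5):
    """True where the costly DL denoiser would run (within 18.5° of gaze),
    False where cheap TAA suffices, following the partition described in
    the abstract."""
    return eccentricity_map(width, height, gaze_px, deg_per_px) <= threshold_deg

# Example: assume a 90° horizontal FOV across a 900-px-wide frame,
# i.e. 0.1 deg/px, with the gaze fixed at the frame center.
mask = foveation_mask(900, 900, gaze_px=(450, 450), deg_per_px=0.1)
frac = mask.mean()  # fraction of pixels routed to the expensive denoiser
```

Under these (assumed) viewing parameters, the DL-denoised disc covers only on the order of 13% of the pixels, which is consistent with the abstract's claim of a substantial (≥40.22%) saving over running the DL denoiser globally.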
ISSN: 2169-3536