A two-stage HDR reconstruction pipeline for extreme dark-light RGGB images



Bibliographic Details
Main Authors: Yiyao Huang, Xiaobao Zhu, Fenglian Yuan, Jing Shi, U. Kintak, Jingfei Fu, Yiran Peng, Chenheng Deng
Format: Article
Language: English
Published: Nature Portfolio 2025-01-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-025-87412-x
Description
Summary: Abstract RGGB sensor arrays are widely used in digital cameras and mobile photography. However, images captured under extreme low-light conditions often suffer from insufficient exposure because the sensor receives too little light. Existing methods mainly employ U-Net variants, multi-stage camera-parameter simulation, or image-parameter processing to address this issue. However, these methods usually apply color adjustments uniformly across the entire image, which can introduce extensive blue or green noise artifacts, especially in images with dark backgrounds. This study addresses the problem by proposing a novel multi-step image-enhancement process. The pipeline starts with a self-attention U-Net for initial color restoration and then applies a Color Correction Matrix (CCM). Next, High Dynamic Range (HDR) reconstruction techniques are used to improve exposure under various Camera Response Functions (CRFs). After under- and over-exposed frames are removed, pseudo-HDR images are created through multi-frame fusion. A comparative analysis on a standard dataset shows that the proposed approach produces better-exposed images and improves the Peak Signal-to-Noise Ratio (PSNR) by 0.16 dB over the benchmark methods.
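The later stages of the described pipeline (CCM application, frame filtering, and multi-frame fusion) can be sketched as follows. This is an illustrative simplification, not the paper's implementation: the function names, the mean-brightness thresholds for discarding frames, and the Mertens-style well-exposedness weighting are all assumptions introduced here for clarity.

```python
import numpy as np

def apply_ccm(rgb, ccm):
    # Apply a 3x3 Color Correction Matrix to an HxWx3 image in [0, 1].
    return np.clip(rgb @ ccm.T, 0.0, 1.0)

def well_exposedness(img, sigma=0.2):
    # Weight each pixel by its closeness to mid-gray (0.5), multiplied
    # across channels -- a common heuristic from exposure-fusion methods.
    return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)).prod(axis=-1)

def fuse_frames(frames, low=0.05, high=0.95):
    # Discard frames whose mean brightness marks them as under- or
    # over-exposed (thresholds are illustrative), then blend the rest
    # with per-pixel well-exposedness weights into one pseudo-HDR image.
    kept = [f for f in frames if low < f.mean() < high]
    weights = np.stack([well_exposedness(f) for f in kept])
    weights /= weights.sum(axis=0, keepdims=True) + 1e-8
    return np.einsum('nhw,nhwc->hwc', weights, np.stack(kept))
```

In practice the differently exposed frames would come from re-rendering the restored image through several CRFs; here any stack of HxWx3 arrays in [0, 1] works.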
ISSN: 2045-2322