Underwater image enhancement via multiscale disentanglement strategy


Bibliographic Details
Main Authors: Jiaquan Yan, Hao Hu, Yijian Wang, Muhammad Wasim Nawaz, Naveed Ur Rehman Junejo, Ente Guo, Huibin Feng
Format: Article
Language: English
Published: Nature Portfolio 2025-02-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-89109-7
Description
Summary: Underwater images suffer from color casts, low illumination, and blurred details caused by light absorption and scattering in water. Existing data-driven methods often overlook the scene characteristics of underwater imaging, which limits their expressive power. To address these issues, we propose a Multiscale Disentanglement Network (MD-Net) for Underwater Image Enhancement (UIE), which mainly consists of scene radiance disentanglement (SRD) and transmission map disentanglement (TMD) modules. Specifically, MD-Net first disentangles the original image into three physical parameters: the scene radiance (clear image), the transmission map, and the global background light. The network then reconstructs an underwater image from these physical parameters. Furthermore, MD-Net introduces class adversarial learning between the original and reconstructed images to supervise the disentanglement accuracy of the network. Moreover, we design a multi-level fusion module (MFM) and a dual-layer weight estimation unit (DWEU) for color cast adjustment and visibility enhancement. Finally, we conduct extensive qualitative and quantitative experiments on three benchmark datasets, which demonstrate that our approach outperforms traditional and state-of-the-art methods. Our code and results are available at https://github.com/WYJGR/MD-Net.
ISSN: 2045-2322
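
Note on the physical model: the disentanglement described in the summary, into scene radiance J, transmission map t, and global background light B, corresponds to the standard underwater image formation model I(x) = J(x)·t(x) + B·(1 − t(x)). The short NumPy sketch below illustrates the reconstruction step under that assumption; the function name, array shapes, and toy values are illustrative and are not taken from the authors' implementation (see the repository linked above for the actual code).

# Minimal sketch (assumed, not the authors' code) of recombining the three
# disentangled parameters back into an underwater image via
#     I(x) = J(x) * t(x) + B * (1 - t(x)).
import numpy as np

def reconstruct_underwater(J: np.ndarray, t: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Recompose an underwater image from disentangled physical parameters.

    J : (H, W, 3) scene radiance (clear image), values in [0, 1]
    t : (H, W) or (H, W, 1) transmission map, values in (0, 1]
    B : (3,) global background light per color channel
    """
    if t.ndim == 2:                       # allow a single-channel transmission map
        t = t[..., None]
    I = J * t + B.reshape(1, 1, 3) * (1.0 - t)
    return np.clip(I, 0.0, 1.0)

# Toy usage: a random "clear" image, a constant transmission map, and a bluish
# background light (water attenuates red most strongly).
rng = np.random.default_rng(0)
J = rng.random((64, 64, 3))
t = np.full((64, 64, 1), 0.6)
B = np.array([0.1, 0.4, 0.6])
I = reconstruct_underwater(J, t, B)
print(I.shape, I.min(), I.max())

Per the summary, it is this recomposed image that is compared adversarially against the original input, so that the quality of the reconstruction supervises how accurately the three physical parameters were disentangled.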