An Improved YOLOv9s Algorithm for Underwater Object Detection

Bibliographic Details
Main Authors: Shize Zhou, Long Wang, Zhuoqun Chen, Hao Zheng, Zhihui Lin, Li He
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Journal of Marine Science and Engineering
Online Access:https://www.mdpi.com/2077-1312/13/2/230
Description
Summary: Monitoring marine life through underwater object detection technology serves as a primary means of understanding biodiversity and ecosystem health. However, the complex marine environment, poor resolution, color distortion in underwater optical imaging, and limited computational resources all affect the accuracy and efficiency of underwater object detection. To address these problems, the YOLOv9s-SD underwater object detection algorithm is proposed to improve detection performance in underwater environments. We combine the inverted residual structure of MobileNetV2 with the Simple Attention Module (SimAM) and Squeeze-and-Excitation (SE) attention to form the Simple Enhancement Attention Module (SME) and optimize AConv, improving the sensitivity of the model to object details. Furthermore, we introduce the lightweight DySample operator to optimize feature recovery, enabling better adaptation to the complex characteristics of underwater targets. Finally, we employ Wise-IoU version 3 (WIoU v3) as the loss function to balance the loss weights for targets of different sizes. In experiments on the URPC and Brackish underwater datasets, YOLOv9s-SD improves mean Average Precision (mAP) by 1.3% and 1.2% over the YOLOv9s model, reaching 83.0% and 94.3% on the respective datasets and demonstrating better adaptability to intricate underwater environments.
ISSN: 2077-1312
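
The SME module described in the abstract combines a MobileNetV2-style inverted residual block with SimAM and SE attention. The sketch below is an illustrative assumption of how such a combination could be wired up in PyTorch; the layer sizes, SimAM epsilon, block layout, and the `SMEBlock` name are hypothetical and do not reproduce the authors' exact SME or AConv design.

```python
# Illustrative sketch only: one plausible way to combine SE and SimAM attention
# inside a MobileNetV2-style inverted residual block, loosely following the
# SME module described in the abstract. All sizes and the layout are assumptions.
import torch
import torch.nn as nn


class SimAM(nn.Module):
    """Parameter-free SimAM attention: re-weights activations with an
    energy-based saliency score computed per channel."""
    def __init__(self, eps: float = 1e-4):
        super().__init__()
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, _, h, w = x.shape
        n = h * w - 1
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)
        v = d.sum(dim=(2, 3), keepdim=True) / n
        energy = d / (4 * (v + self.eps)) + 0.5
        return x * torch.sigmoid(energy)


class SEAttention(nn.Module):
    """Squeeze-and-Excitation: global pooling followed by a two-layer
    bottleneck that rescales channels."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)


class SMEBlock(nn.Module):
    """Hypothetical SME-style block: inverted residual with SimAM and SE
    applied to the expanded depthwise features."""
    def __init__(self, in_ch: int, out_ch: int, expand: int = 4, stride: int = 1):
        super().__init__()
        mid = in_ch * expand
        self.use_residual = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, mid, 1, bias=False),                        # 1x1 expansion
            nn.BatchNorm2d(mid),
            nn.SiLU(inplace=True),
            nn.Conv2d(mid, mid, 3, stride, 1, groups=mid, bias=False),   # depthwise 3x3
            nn.BatchNorm2d(mid),
            nn.SiLU(inplace=True),
            SimAM(),                                                     # parameter-free attention
            SEAttention(mid),                                            # channel attention
            nn.Conv2d(mid, out_ch, 1, bias=False),                       # 1x1 projection
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.block(x)
        return x + y if self.use_residual else y


if __name__ == "__main__":
    feat = torch.randn(1, 64, 80, 80)      # dummy backbone feature map
    print(SMEBlock(64, 64)(feat).shape)    # torch.Size([1, 64, 80, 80])
```

In this sketch the residual connection is kept only when the stride is 1 and the channel counts match, mirroring the standard MobileNetV2 convention; the attention operators act on the expanded features before projection, which is one common placement but not necessarily the one used in the paper.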