DEMNet: A Small Object Detection Method for Tea Leaf Blight in Slightly Blurry UAV Remote Sensing Images

Bibliographic Details
Main Authors: Yating Gu, Yuxin Jing, Hao-Dong Li, Juntao Shi, Haifeng Lin
Format: Article
Language: English
Published: MDPI AG 2025-06-01
Series: Remote Sensing
Online Access: https://www.mdpi.com/2072-4292/17/12/1967
Summary: Unmanned aerial vehicles (UAVs) are widely used in agricultural disease detection. However, slight image blurring caused by lighting, wind, and flight instability often hampers the detection of dense small targets such as tea leaf blight spots. To address this problem, this paper proposes DEMNet, a model based on the YOLOv8n architecture that is designed to improve the detection of small, blurry objects in UAV imagery. DEMNet introduces a dynamic convolution mechanism into the HGNetV2 backbone to form DynamicHGNetV2, enabling adaptive generation of convolutional weights and improving feature extraction for blurry objects. An efficient EMAFPN neck structure further facilitates deep–shallow feature interaction while reducing computational cost. Additionally, a novel CMLAB module replaces the traditional C2f structure, employing multi-scale convolutions and local attention mechanisms to recover semantic information in blurry regions and better detect densely distributed small targets. Experimental results on a slightly blurry tea leaf blight dataset demonstrate that DEMNet surpasses the baseline by 5.7% in recall and 4.9% in mAP@0.5, while reducing the parameter count to 1.7 M, computation to 6.1 GFLOPs, and model size to 4.2 MB, demonstrating both high accuracy and strong deployment potential.
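
To make the dynamic convolution idea concrete: the abstract describes DynamicHGNetV2 as generating convolutional weights adaptively from the input. A common way to realize this is kernel aggregation in the style of dynamic convolution (Chen et al., CVPR 2020), sketched below in PyTorch. The class name DynamicConv2d, the number of candidate kernels K, and the attention head used here are illustrative assumptions; the record does not detail the paper's actual implementation inside HGNetV2.

```python
# Minimal sketch of input-conditioned kernel aggregation (dynamic convolution).
# Assumptions: K candidate kernels, attention via GAP -> FC -> softmax.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicConv2d(nn.Module):
    """Combines K candidate kernels with attention weights computed from the input."""
    def __init__(self, in_ch, out_ch, kernel_size=3, K=4, padding=1):
        super().__init__()
        self.K, self.out_ch = K, out_ch
        self.ksize, self.padding = kernel_size, padding
        # K candidate kernels, aggregated per sample at forward time.
        self.weight = nn.Parameter(
            torch.randn(K, out_ch, in_ch, kernel_size, kernel_size) * 0.02
        )
        # Lightweight attention head: global average pool -> linear -> softmax over K.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(in_ch, K),
        )

    def forward(self, x):
        B, C, H, W = x.shape
        pi = F.softmax(self.attn(x), dim=1)                   # (B, K) kernel weights
        # Per-sample aggregated kernel: sum_k pi_k * W_k
        w = torch.einsum('bk,koihw->boihw', pi, self.weight)  # (B, out, in, k, k)
        # Grouped-conv trick: apply a different kernel to each sample in the batch.
        x = x.reshape(1, B * C, H, W)
        w = w.reshape(B * self.out_ch, C, self.ksize, self.ksize)
        y = F.conv2d(x, w, padding=self.padding, groups=B)
        return y.reshape(B, self.out_ch, H, W)
```

In a DEMNet-style backbone, blocks of this kind would stand in for standard convolutions so the effective kernel adapts to each input, which is the property the abstract credits with improved feature extraction in blurry regions.
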
ISSN: 2072-4292