D-YOLO: A Lightweight Model for Strawberry Health Detection

Bibliographic Details
Main Authors: Enhui Wu, Ruijun Ma, Daming Dong, Xiande Zhao
Format: Article
Language: English
Published: MDPI AG, 2025-03-01
Series: Agriculture
Online Access: https://www.mdpi.com/2077-0472/15/6/570
Description
Summary: In complex agricultural settings, accurately and rapidly identifying the growth and health conditions of strawberries remains a formidable challenge. This study therefore develops a deep learning framework, Disease-YOLO (D-YOLO), based on the YOLOv8s model to monitor the health status of strawberries. Key innovations include (1) replacing the original backbone with MobileNetv3 to optimize computational efficiency; (2) implementing a Bidirectional Feature Pyramid Network for enhanced multi-scale feature fusion; (3) integrating Contextual Transformer attention modules in the neck network to improve lesion localization; and (4) adopting a weighted intersection-over-union loss to address class imbalance. Evaluated on a custom strawberry disease dataset containing 1301 annotated images across three fruit development stages and five plant health states, D-YOLO achieved 89.6% mAP on the training set and 90.5% mAP on the test set while reducing parameters by 72.0% and floating-point operations by 75.1% compared to the baseline YOLOv8s. In comparative trials, the framework's balanced performance and computational efficiency surpassed conventional models including Faster R-CNN, RetinaNet, YOLOv5s, YOLOv6s, and YOLOv8s. Cross-domain validation on a maize disease dataset demonstrated D-YOLO's superior generalization, with 94.5% mAP, outperforming YOLOv8 by 0.6%. This lightweight solution enables precise, real-time crop health monitoring, and the proposed architectural improvements provide a practical paradigm for intelligent disease detection in precision agriculture.
ISSN: 2077-0472
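
The class-imbalance remedy mentioned in the abstract (innovation 4) can be sketched roughly as follows. This is a minimal illustration assuming a simple per-class weight applied to a standard 1 − IoU loss; the paper's exact weighting formulation is not specified in this record, so the function names and weighting scheme here are hypothetical:

```python
# Hypothetical sketch of a class-weighted IoU loss; the per-class weight
# lets rare categories (e.g. an underrepresented disease state)
# contribute more to the total loss.

def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def weighted_iou_loss(pred_box, target_box, class_id, class_weights):
    """(1 - IoU) scaled by a per-class weight to counter class imbalance."""
    return class_weights[class_id] * (1.0 - iou(pred_box, target_box))

# Example: partially overlapping boxes, with class 1 upweighted 2x.
# IoU = 1/7, so loss = 2.0 * (1 - 1/7) ≈ 1.714
loss = weighted_iou_loss((0, 0, 2, 2), (1, 1, 3, 3), 1, {0: 1.0, 1: 2.0})
```

In a full detector, such a weight would typically be derived from inverse class frequencies in the training set, so that the five health states and three development stages with fewer examples are not dominated by the majority classes during optimization.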