“Image-Spectral” fusion monitoring of peanut leaf spot disease level based on deep learning

Bibliographic Details
Main Authors: Yongda Lin, Jiangtao Tan, Hong Li, Xi Li, Jianguo Wang, Tingting Chen, Lei Zhang
Format: Article
Language:English
Published: Elsevier 2025-12-01
Series:Smart Agricultural Technology
Subjects:
Online Access:http://www.sciencedirect.com/science/article/pii/S2772375525005465
Description
Summary:Leaf disease is one of the primary causes of reduced peanut yield and quality. Rapid and accurate identification of the peanut leaf disease infection stage is essential for producers to implement effective and timely control measures. However, existing peanut leaf disease detection methods based on images or hyperspectral data alone still fall short in detection accuracy and stability. To address these limitations, this study proposes a robust multi-source feature fusion model for peanut leaf spot detection, integrating ResNet101 for RGB image feature extraction and an improved 1D-CNN for hyperspectral feature extraction. The model processes RGB and spectral data simultaneously, both loaded efficiently through a unified dataloader. The architecture consists of four key components: ResNet101 as the RGB feature extractor, the improved 1D-CNN for spectral features, a feature fusion module, and a final classification layer that outputs the disease severity level. Experimental results demonstrate that the proposed multimodal model achieves a detection accuracy of 91.80 %, outperforming the unimodal RGB-based ResNet101 and hyperspectral-based 1D-CNN by 5.12 % and 6.56 %, respectively. RGB images offer high-resolution macroscopic indicators such as color, shape, and texture, while hyperspectral reflectance captures early-stage physiological and biochemical changes invisible to the naked eye. Fusing these complementary modalities enables more accurate identification of peanut leaf spot severity. This work not only highlights the effectiveness of multimodal data integration but also provides a promising approach to precision disease management and targeted pesticide application in peanut cultivation.
ISSN:2772-3755
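The two-branch fusion architecture described in the summary can be sketched as follows. This is a minimal illustration assuming PyTorch: the layer sizes, fusion dimension, and number of severity classes are illustrative guesses rather than the paper's actual values, and a tiny placeholder CNN stands in for the ResNet101 backbone to keep the sketch self-contained.

```python
# Hedged sketch of the "image-spectral" fusion model from the abstract.
# All hyperparameters here are assumptions, not taken from the paper.
import torch
import torch.nn as nn

class SpectralCNN1D(nn.Module):
    """Stand-in for the paper's "improved 1D-CNN" spectral branch."""
    def __init__(self, n_bands: int, out_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),           # -> (B, 32, 1)
        )
        self.fc = nn.Linear(32, out_dim)

    def forward(self, x):                       # x: (B, n_bands) reflectance
        x = self.features(x.unsqueeze(1)).squeeze(-1)  # (B, 32)
        return self.fc(x)                       # (B, out_dim)

class FusionModel(nn.Module):
    """RGB branch + spectral branch -> concatenation fusion -> severity class."""
    def __init__(self, rgb_backbone: nn.Module, rgb_dim: int,
                 n_bands: int, n_classes: int = 4):
        super().__init__()
        self.rgb = rgb_backbone                 # e.g. ResNet101 with its fc removed
        self.spec = SpectralCNN1D(n_bands)
        self.classifier = nn.Sequential(
            nn.Linear(rgb_dim + 128, 64), nn.ReLU(),
            nn.Linear(64, n_classes),           # disease severity logits
        )

    def forward(self, rgb, spectrum):
        fused = torch.cat([self.rgb(rgb), self.spec(spectrum)], dim=1)
        return self.classifier(fused)

# Demo: a tiny CNN stands in for ResNet101 so the sketch runs without torchvision.
tiny_backbone = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
model = FusionModel(tiny_backbone, rgb_dim=8, n_bands=200)
logits = model(torch.randn(2, 3, 64, 64), torch.randn(2, 200))  # shape (2, 4)
```

In practice the RGB branch would be `torchvision.models.resnet101` with its final fully connected layer replaced (giving `rgb_dim=2048`), and the classifier head would output one logit per severity level defined in the study.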