Background-Masked Lightweight Approach for Pear Leaf Disease Recognition
| Main Authors: | , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11098832/ |
| Summary: | Deep learning for plant disease recognition faces challenges in real-world scenarios, especially on resource-constrained devices and with images that have complex backgrounds. While large deep neural networks handle such images well, lightweight models struggle to extract discriminative features, which reduces classification accuracy. To overcome this, we introduce a novel pipeline that masks and removes complex backgrounds, enabling accurate disease recognition from the affected leaves alone. Because the classifier receives object-cropped input, this approach also significantly increases model accuracy and reduces training time compared to baseline models. Using various lightweight neural networks as backbones for the classification task, we validated our approach, which consistently outperformed the baselines. Specifically, our proposed background-masking model achieved an IoU of 93.93%. For disease recognition, our approach with the SqueezeNet-1.1 backbone reached the highest accuracy of 90.84%, surpassing the baseline's highest average accuracy of 89.22%. These results demonstrate the efficacy of the proposed method for disease recognition in agricultural imagery, thereby contributing to the advancement of deep learning techniques in this domain. Moreover, a comparison of training times shows that our approach achieved a considerable reduction relative to the benchmark. |
| ISSN: | 2169-3536 |
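The pipeline described in the summary — mask out the complex background, keep only the leaf pixels, then crop to the leaf's bounding box before classification — can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the mask would in practice be predicted by the authors' trained segmentation model, while here a toy boolean mask is supplied directly, and the function names (`apply_mask_and_crop`, `iou`) are this sketch's own.

```python
import numpy as np

def apply_mask_and_crop(image, mask):
    """Zero out background pixels and crop to the mask's bounding box.

    image: (H, W, C) uint8 array; mask: (H, W) boolean array, True = leaf.
    In the paper's pipeline the mask comes from a segmentation network;
    here it is passed in directly for illustration.
    """
    masked = image * mask[..., None]        # remove the background
    ys, xs = np.nonzero(mask)               # foreground pixel coordinates
    if ys.size == 0:                        # empty mask: nothing to crop
        return masked
    top, bottom = ys.min(), ys.max() + 1
    left, right = xs.min(), xs.max() + 1
    return masked[top:bottom, left:right]   # object-cropped input

def iou(pred, target):
    """Intersection-over-Union of two boolean masks (the metric behind the
    reported 93.93% background-masking score)."""
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0

# Toy example: a 6x6 grey image with a 3x3 "leaf" region.
img = np.full((6, 6, 3), 200, dtype=np.uint8)
m = np.zeros((6, 6), dtype=bool)
m[1:4, 2:5] = True
crop = apply_mask_and_crop(img, m)
print(crop.shape)   # (3, 3, 3)
print(iou(m, m))    # 1.0
```

The object-cropped output would then be resized and fed to a lightweight classifier (e.g. SqueezeNet-1.1, as in the article); feeding only the cropped leaf is what the summary credits for the accuracy gain and shorter training time.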