Enhancing Image Classification of Cabbage Plant Diseases Using a Hybrid Model Convolutional Neural Network and XGBoost

Bibliographic Details
Main Authors: Nabila Ayunda Sovia, Ni Wayan Surya Wardhani, Eni Sumarminingsih, Elvo Ramadhan Shofa
Format: Article
Language: English
Published: Mathematics Department, UIN Maulana Malik Ibrahim Malang, 2025-03-01
Series: Cauchy: Jurnal Matematika Murni dan Aplikasi
Online Access:https://ejournal.uin-malang.ac.id/index.php/Math/article/view/30866
Description
Summary: Classifying imbalanced datasets presents significant challenges, often leading to biased model performance, particularly in multiclass classification. This study addresses these issues by integrating Convolutional Neural Networks (CNN) and XGBoost, leveraging CNN's exceptional feature-extraction capabilities and XGBoost's robust handling of imbalanced data. The hybrid CNN-XGBoost model was applied to classify cabbage plants affected by pests and diseases, categorized into five classes with a severe imbalance between healthy and affected plants. A comparative analysis showed that the CNN-XGBoost approach significantly outperformed the standalone CNN, achieving a Balanced Accuracy of 0.93 versus 0.53, particularly on minority-class predictions. This approach not only enhances the accuracy of plant disease and pest diagnosis but also offers farmers a practical way to identify and classify affected cabbage plants efficiently, contributing to more effective agricultural management.
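The two-stage pipeline the abstract describes (a CNN as feature extractor feeding a gradient-boosted classifier, evaluated with balanced accuracy) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic imbalanced dataset, the fixed hand-crafted filters standing in for the trained CNN's feature maps, and scikit-learn's `GradientBoostingClassifier` standing in for XGBoost are all assumptions made for a self-contained example.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic imbalanced 5-class image data (a stand-in for the cabbage dataset):
# one large "healthy" class and four small pest/disease classes.
counts = [200, 20, 20, 20, 20]
images, labels = [], []
for cls, n in enumerate(counts):
    for _ in range(n):
        # Each class gets a distinct base intensity plus noise.
        images.append(cls * 0.5 + rng.normal(0.0, 0.1, size=(16, 16)))
        labels.append(cls)
X_img, y = np.array(images), np.array(labels)

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation, standing in for a CNN conv layer."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Fixed edge filters plus global statistics: a crude hand-crafted
# extractor standing in for the trained CNN's learned feature maps.
kernels = [
    np.array([[1., 0., -1.], [1., 0., -1.], [1., 0., -1.]]),  # vertical edges
    np.array([[1., 1., 1.], [0., 0., 0.], [-1., -1., -1.]]),  # horizontal edges
]

def extract_features(img):
    feats = [img.mean(), img.std()]
    for k in kernels:
        resp = np.maximum(conv2d(img, k), 0.0)   # ReLU-style activation
        feats.extend([resp.mean(), resp.max()])  # global average / max pooling
    return feats

X = np.array([extract_features(img) for img in X_img])

# Stage 2: gradient boosting on the extracted features.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = GradientBoostingClassifier(random_state=0)
clf.fit(X_tr, y_tr)

# Balanced accuracy averages per-class recall, so minority classes
# weigh as much as the majority class -- the metric used in the study.
bal_acc = balanced_accuracy_score(y_te, clf.predict(X_te))
print(f"Balanced accuracy: {bal_acc:.2f}")
```

Stratified splitting keeps every minority class represented in the test set; in the actual study, XGBoost replaces the gradient-boosting stage and the features come from a CNN trained on the cabbage images.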
ISSN: 2086-0382; 2477-3344