Quantifying leaf symptoms of sorghum charcoal rot in images of field‐grown plants using deep neural networks

Abstract Charcoal rot of sorghum (CRS) is a significant disease affecting sorghum crops, with limited genetic resistance available. The causative agent, Macrophomina phaseolina (Tassi) Goid., is a highly destructive fungal pathogen that targets over 500 plant species globally, including essential staple crops. Utilizing field image data for precise detection and quantification of CRS could greatly assist in the prompt identification and management of affected fields and thereby reduce yield losses. The objective of this work was to implement various machine learning algorithms and evaluate their ability to accurately detect and quantify CRS in red‐green‐blue (RGB) images of sorghum plants exhibiting symptoms of infection. EfficientNet‐B3 and a fully convolutional network (FCN) emerged as the top‐performing models for the image classification and segmentation tasks, respectively. Among the classification models evaluated, EfficientNet‐B3 demonstrated superior performance, achieving an accuracy of 86.97%, a recall of 0.71, and an F1 score of 0.73. Of the segmentation models tested, the FCN proved most effective, exhibiting a validation accuracy of 97.76%, a recall of 0.68, and an F1 score of 0.66. As the size of the image patches increased, both models’ validation scores increased linearly and their inference time decreased exponentially: larger patches contain more information, which improves model performance, and larger patches also mean fewer patches per image, which reduces the computational load and thus the inference time. In addition to being immediately useful for sorghum breeders and growers, the models advance the domain of automated plant phenotyping and may serve as a foundation for drone‐based or other automated field phenotyping efforts. The models presented herein can also be accessed through a web‐based application where users can easily analyze their own images.
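The classification workflow described in the abstract can be illustrated with a short sketch. This is a minimal illustration only, assuming a PyTorch/torchvision setup with an ImageNet‐pretrained EfficientNet‐B3; it is not the authors' implementation, and the function name build_crs_patch_classifier, the two‐class (healthy vs. symptomatic) setup, and the preprocessing choices are hypothetical.

```python
# Minimal sketch (assumption: a PyTorch/torchvision workflow, not the authors' code).
# Adapts a pretrained EfficientNet-B3 to classify RGB image patches as
# "healthy" vs. "CRS-symptomatic"; class count and names are hypothetical.
import torch
import torch.nn as nn
from torchvision import models, transforms

def build_crs_patch_classifier(num_classes: int = 2) -> nn.Module:
    """Return an EfficientNet-B3 whose classifier head is resized for CRS patches."""
    model = models.efficientnet_b3(weights=models.EfficientNet_B3_Weights.IMAGENET1K_V1)
    in_features = model.classifier[1].in_features  # final linear layer of EfficientNet-B3
    model.classifier[1] = nn.Linear(in_features, num_classes)
    return model

# Preprocessing for a single RGB patch cropped from a field image.
preprocess = transforms.Compose([
    transforms.Resize((300, 300)),  # EfficientNet-B3's native input resolution
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = build_crs_patch_classifier()
model.eval()
# `patch` would be a PIL.Image crop of a sorghum leaf region:
# logits = model(preprocess(patch).unsqueeze(0))
# prob_symptomatic = torch.softmax(logits, dim=1)[0, 1].item()
```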

Bibliographic Details
Main Authors: Emmanuel M. Gonzalez, Ariyan Zarei, Sebastian Calleja, Clay Christenson, Bruno Rozzi, Jeffrey Demieville, Jiahuai Hu, Andrea L. Eveland, Brian Dilkes, Kobus Barnard, Eric Lyons, Duke Pauli
Format: Article
Language: English
Published: Wiley 2024-12-01
Series: Plant Phenome Journal
Online Access:https://doi.org/10.1002/ppj2.20110
ISSN: 2578-2703
Volume: 7, Issue: 1
Author Affiliations: School of Plant Sciences, University of Arizona, Tucson, Arizona, USA (Gonzalez, Calleja, Christenson, Rozzi, Demieville, Hu, Lyons, Pauli); Department of Computer Science, University of Arizona, Tucson, Arizona, USA (Zarei, Barnard); Donald Danforth Plant Science Center, St. Louis, Missouri, USA (Eveland); Department of Biochemistry, Purdue University, West Lafayette, Indiana, USA (Dilkes)
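For the segmentation and patch‐size findings reported in the abstract, the sketch below is likewise illustrative only: it assumes torchvision's fcn_resnet50 as a stand‐in for the paper's FCN (the abstract does not name the exact architecture or framework), a hypothetical two‐class label map (background/healthy vs. CRS‐symptomatic), and a hypothetical 6000 × 4000 px field image to show why fewer, larger patches reduce inference time.

```python
# Minimal sketch (assumption: torchvision's FCN-ResNet50 as a stand-in for the
# paper's FCN; class indices, image size, and patch sizes are hypothetical).
import math
import torch
from torchvision import models

# Two hypothetical classes: 0 = background/healthy tissue, 1 = CRS-symptomatic tissue.
fcn = models.segmentation.fcn_resnet50(weights=None, weights_backbone=None, num_classes=2)
fcn.eval()

def symptomatic_fraction(image_batch: torch.Tensor) -> float:
    """Fraction of pixels predicted as CRS-symptomatic in a normalized RGB batch (N, 3, H, W)."""
    with torch.no_grad():
        logits = fcn(image_batch)["out"]  # shape: (N, 2, H, W)
        pred = logits.argmax(dim=1)       # per-pixel class labels
    return (pred == 1).float().mean().item()

def patches_per_image(width: int, height: int, patch: int) -> int:
    """Number of non-overlapping patches needed to tile a width x height image."""
    return math.ceil(width / patch) * math.ceil(height / patch)

# Fewer, larger patches mean fewer forward passes per image, which is consistent
# with the faster inference the abstract reports for larger patch sizes.
for p in (256, 512, 1024):
    print(p, patches_per_image(6000, 4000, p))  # 384, 96, 24 patches
```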