Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis
Crop field monitoring using unmanned aerial vehicles (UAVs) is one of the most important technologies for plant growth control in modern precision agriculture. One of the important and widely used tasks in field monitoring is plant stand counting. The accurate identification of plants in field images provides estimates of plant number per unit area, detects missing seedlings, and predicts crop yield.
Main Authors: Mikhail V. Kozhekin, Mikhail A. Genaev, Evgenii G. Komyshev, Zakhar A. Zavyalov, Dmitry A. Afonnikov
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Journal of Imaging
Subjects: crop; field image; plant counting; UAV; deep learning; semantic segmentation
Online Access: https://www.mdpi.com/2313-433X/11/1/28
_version_ | 1832588293328338944 |
author | Mikhail V. Kozhekin; Mikhail A. Genaev; Evgenii G. Komyshev; Zakhar A. Zavyalov; Dmitry A. Afonnikov |
author_facet | Mikhail V. Kozhekin; Mikhail A. Genaev; Evgenii G. Komyshev; Zakhar A. Zavyalov; Dmitry A. Afonnikov |
author_sort | Mikhail V. Kozhekin |
collection | DOAJ |
description | Crop field monitoring using unmanned aerial vehicles (UAVs) is one of the most important technologies for plant growth control in modern precision agriculture. One of the important and widely used tasks in field monitoring is plant stand counting. The accurate identification of plants in field images provides estimates of plant number per unit area, detects missing seedlings, and predicts crop yield. Current methods are based on the detection of plants in images obtained from UAVs by means of computer vision algorithms and deep learning neural networks. These approaches depend on image spatial resolution and the quality of plant markup. The performance of automatic plant detection may affect the efficiency of downstream analysis of a field cropping pattern. In the present work, a method is presented for detecting the plants of five species in images acquired via a UAV on the basis of image segmentation by deep learning algorithms (convolutional neural networks). Twelve orthomosaics were collected and marked at several sites in Russia to train and test the neural network algorithms. Additionally, 17 existing datasets of various spatial resolutions and markup quality levels from the Roboflow service were used to extend training image sets. Finally, we compared several texture features between manually evaluated and neural-network-estimated plant masks. It was demonstrated that adding images to the training sample (even those of lower resolution and markup quality) improves plant stand counting significantly. The work indicates how the accuracy of plant detection in field images may affect their cropping pattern evaluation by means of texture characteristics. For some of the characteristics (GLCM mean, GLRM long run, GLRM run ratio) the estimates between images marked manually and automatically are close. For others, the differences are large and may lead to erroneous conclusions about the properties of field cropping patterns. Nonetheless, overall, plant detection algorithms with a higher accuracy show better agreement with the estimates of texture parameters obtained from manually marked images. |
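The plant stand counting step that the abstract describes (deriving a plant count from a segmentation mask) is commonly implemented as connected-component labeling of the binary mask. The sketch below is a hypothetical illustration of that idea, not the authors' implementation; the `count_plants` helper and its `min_area` noise threshold are assumptions.

```python
import numpy as np
from scipy import ndimage

def count_plants(mask, min_area=4):
    """Count plant instances in a binary segmentation mask,
    ignoring connected components smaller than min_area pixels
    (treated as segmentation noise)."""
    labeled, n = ndimage.label(mask)
    if n == 0:
        return 0
    # Per-component pixel counts (sum of True values per label).
    areas = np.asarray(ndimage.sum(mask, labeled, index=range(1, n + 1)))
    return int(np.count_nonzero(areas >= min_area))

# Toy 10x10 mask with two plants and one isolated noise pixel.
mask = np.zeros((10, 10), dtype=bool)
mask[1:4, 1:4] = True   # plant 1 (9 px)
mask[6:9, 5:8] = True   # plant 2 (9 px)
mask[0, 9] = True       # noise speckle (1 px)
print(count_plants(mask))  # → 2
```

With the threshold lowered to 1 pixel, the speckle would also be counted, which is why some minimum-area filtering is typically applied before counting.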
format | Article |
id | doaj-art-a2833e09a2ed45a8aed9c7284e8b7a5e |
institution | Kabale University |
issn | 2313-433X |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Journal of Imaging |
spelling | doaj-art-a2833e09a2ed45a8aed9c7284e8b7a5e; 2025-01-24T13:36:20Z; eng; MDPI AG; Journal of Imaging; ISSN 2313-433X; published 2025-01-01; vol. 11, no. 1, art. 28; doi:10.3390/jimaging11010028. Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis. Mikhail V. Kozhekin, Mikhail A. Genaev, Evgenii G. Komyshev, Dmitry A. Afonnikov (Institute of Cytology and Genetics, Siberian Branch of Russian Academy of Sciences, 630090 Novosibirsk, Russia); Zakhar A. Zavyalov (GeosAero LLC, 440000 Penza, Russia). Abstract as in the description field above. https://www.mdpi.com/2313-433X/11/1/28. Keywords: crop; field image; plant counting; UAV; deep learning; semantic segmentation |
spellingShingle | Mikhail V. Kozhekin; Mikhail A. Genaev; Evgenii G. Komyshev; Zakhar A. Zavyalov; Dmitry A. Afonnikov. Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis. Journal of Imaging. crop; field image; plant counting; UAV; deep learning; semantic segmentation |
title | Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis |
title_full | Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis |
title_fullStr | Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis |
title_full_unstemmed | Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis |
title_short | Plant Detection in RGB Images from Unmanned Aerial Vehicles Using Segmentation by Deep Learning and an Impact of Model Accuracy on Downstream Analysis |
title_sort | plant detection in rgb images from unmanned aerial vehicles using segmentation by deep learning and an impact of model accuracy on downstream analysis |
topic | crop; field image; plant counting; UAV; deep learning; semantic segmentation |
url | https://www.mdpi.com/2313-433X/11/1/28 |
work_keys_str_mv | AT mikhailvkozhekin plantdetectioninrgbimagesfromunmannedaerialvehiclesusingsegmentationbydeeplearningandanimpactofmodelaccuracyondownstreamanalysis AT mikhailagenaev plantdetectioninrgbimagesfromunmannedaerialvehiclesusingsegmentationbydeeplearningandanimpactofmodelaccuracyondownstreamanalysis AT evgeniigkomyshev plantdetectioninrgbimagesfromunmannedaerialvehiclesusingsegmentationbydeeplearningandanimpactofmodelaccuracyondownstreamanalysis AT zakharazavyalov plantdetectioninrgbimagesfromunmannedaerialvehiclesusingsegmentationbydeeplearningandanimpactofmodelaccuracyondownstreamanalysis AT dmitryaafonnikov plantdetectioninrgbimagesfromunmannedaerialvehiclesusingsegmentationbydeeplearningandanimpactofmodelaccuracyondownstreamanalysis |
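The abstract compares texture characteristics such as the GLCM mean between manually and automatically marked plant masks. As a minimal sketch of what that feature measures, the pure-NumPy function below computes the GLCM mean for a single horizontal offset of one pixel; this is an assumption for illustration, since the article does not specify offsets or tooling (in practice a library such as scikit-image's `graycomatrix` would build the co-occurrence matrix).

```python
import numpy as np

def glcm_mean(image, levels=256):
    """GLCM mean of a grayscale image: sum_{i,j} i * p(i, j),
    where p is the normalized, symmetric gray-level co-occurrence
    matrix for a horizontal offset of one pixel."""
    img = np.asarray(image)
    a = img[:, :-1].ravel()          # left pixel of each horizontal pair
    b = img[:, 1:].ravel()           # right pixel of each pair
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1)       # accumulate co-occurrence counts
    glcm = glcm + glcm.T             # make the matrix symmetric
    p = glcm / glcm.sum()            # normalize to joint probabilities
    i = np.arange(levels)
    return float(np.sum(i[:, None] * p))

# A constant image puts all probability mass at (7, 7),
# so its GLCM mean equals the gray level itself.
img = np.full((8, 8), 7, dtype=np.uint8)
print(glcm_mean(img))  # → 7.0
```

Small differences in such statistics between a manual mask and a predicted mask are exactly what the study quantifies when it asks whether detection accuracy distorts downstream cropping-pattern analysis.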