Detection of Crop Damage in Maize Using Red–Green–Blue Imagery and LiDAR Data Acquired Using an Unmanned Aerial Vehicle

Crop damage caused by wild animals, particularly wild boars (Sus scrofa), significantly reduces agricultural yields, especially in maize fields. This study evaluates two methods for assessing maize crop damage using UAV-acquired data: (1) a deep learning-based approach employing the Deepness plugin in QGIS, applied to high-resolution RGB imagery; and (2) a method based on digital surface models (DSMs) derived from LiDAR data. Manual visual assessment, supported by ground-truthing, served as the reference for validating both methods. The study was conducted in 2023 in a maize field in Central Poland, where UAV flights captured high-resolution RGB imagery and LiDAR data. The DSM-based method achieved higher accuracy (94.7%) and sensitivity (69.9%) than the deep learning method (accuracy: 92.9%, sensitivity: 35.3%), which in turn exhibited higher precision (92.2%) and specificity (99.7%). The DSM-based method also gave a closer estimate of the total damaged area (9.45% of the field) relative to the reference (10.50%), whereas the deep learning method underestimated it (4.01%). The discrepancies stem from how partially damaged areas were classified: the deep learning approach excluded these zones and detected only fully damaged areas. The findings suggest that DSM-based methods are well suited for quantifying extensive damage, while deep learning techniques detect only completely damaged crop areas. Combining the two methods could improve both the accuracy and the efficiency of crop damage assessment. Future studies should explore integrated approaches across diverse crop types and damage patterns to optimize the evaluation of wild animal damage.
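The DSM-based workflow summarized above comes down to a simple raster operation: subtract a terrain model from the LiDAR-derived surface model to obtain canopy height, then flag pixels whose canopy falls below a height threshold as damaged. The Python sketch below illustrates the idea; the file names and the 1.0 m threshold are illustrative assumptions, not values reported in the article.

import numpy as np
import rasterio

HEIGHT_THRESHOLD_M = 1.0  # assumed cut-off: canopy lower than this is treated as damaged

# Hypothetical input rasters: LiDAR-derived surface and terrain models of the field
with rasterio.open("dsm_lidar.tif") as dsm_src, rasterio.open("dtm_lidar.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float64")
    dtm = dtm_src.read(1).astype("float64")
    nodata = dsm_src.nodata

# Canopy height model: surface elevation minus ground elevation
chm = dsm - dtm
valid = np.ones_like(chm, dtype=bool) if nodata is None else (dsm != nodata)

# Low-canopy pixels within the field are flagged as damaged crop
damaged = (chm < HEIGHT_THRESHOLD_M) & valid

damaged_share = 100.0 * damaged.sum() / valid.sum()
print(f"Estimated damaged area: {damaged_share:.2f}% of the field")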


Bibliographic Details
Main Authors: Barbara Dobosz, Dariusz Gozdowski, Jerzy Koronczok, Jan Žukovskis, Elżbieta Wójcik-Gront
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Agronomy
Subjects: crop damage; wild boar; Sus scrofa; maize; UAV; RGB imagery
Online Access: https://www.mdpi.com/2073-4395/15/1/238
_version_ 1832589406817484800
author Barbara Dobosz
Dariusz Gozdowski
Jerzy Koronczok
Jan Žukovskis
Elżbieta Wójcik-Gront
collection DOAJ
description Crop damage caused by wild animals, particularly wild boars (Sus scrofa), significantly impacts agricultural yields, especially in maize fields. This study evaluates two methods for assessing maize crop damage using UAV-acquired data: (1) a deep learning-based approach employing the Deepness plugin in QGIS, utilizing high-resolution RGB imagery; and (2) a method based on digital surface models (DSMs) derived from LiDAR data. Manual visual assessment, supported by ground-truthing, served as the reference for validating these methods. This study was conducted in 2023 in a maize field in Central Poland, where UAV flights captured high-resolution RGB imagery and LiDAR data. Results indicated that the DSM-based method achieved higher accuracy (94.7%) and sensitivity (69.9%) compared to the deep learning method (accuracy: 92.9%, sensitivity: 35.3%), which exhibited higher precision (92.2%) and specificity (99.7%). The DSM-based method provided a closer estimation of the total damaged area (9.45% of the field) compared to the reference (10.50%), while the deep learning method underestimated damage (4.01%). Discrepancies arose from differences in how partially damaged areas were classified; the deep learning approach excluded these zones, focusing on fully damaged areas. The findings suggest that while DSM-based methods are well-suited for quantifying extensive damage, deep learning techniques detect only completely damaged crop areas. Combining these methods could enhance the accuracy and efficiency of crop damage assessments. Future studies should explore integrated approaches across diverse crop types and damage patterns to optimize wild animal damage evaluation.
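The accuracy, sensitivity, precision, and specificity quoted above are standard confusion-matrix metrics obtained by comparing each method's damage map with the manually delineated reference. A minimal Python sketch of how such figures are computed, using placeholder counts rather than the study's data:

def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Confusion-matrix metrics for a binary damaged/undamaged classification."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),  # share of all cells classified correctly
        "sensitivity": tp / (tp + fn),                # recall of the damaged class
        "precision": tp / (tp + fp),                  # share of predicted damage that is real
        "specificity": tn / (tn + fp),                # recall of the undamaged class
    }

# Placeholder pixel/grid-cell counts, not data from the study
print(classification_metrics(tp=700, fp=60, tn=9000, fn=300))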
format Article
id doaj-art-695d61a9bd8c4905a7bb4f197973b06f
institution Kabale University
issn 2073-4395
language English
publishDate 2025-01-01
publisher MDPI AG
record_format Article
series Agronomy
spelling doaj-art-695d61a9bd8c4905a7bb4f197973b06f (indexed 2025-01-24T13:17:17Z)
MDPI AG, Agronomy, ISSN 2073-4395, 2025-01-01, Vol. 15, No. 1, Article 238, DOI 10.3390/agronomy15010238, eng
Detection of Crop Damage in Maize Using Red–Green–Blue Imagery and LiDAR Data Acquired Using an Unmanned Aerial Vehicle
Barbara Dobosz, Dariusz Gozdowski, Elżbieta Wójcik-Gront: Department of Biometry, Institute of Agriculture, Warsaw University of Life Sciences, Nowoursynowska 159, 02-776 Warsaw, Poland
Jerzy Koronczok: Agrocom Polska, Strzelecka 47, 47-120 Żędowice, Poland
Jan Žukovskis: Department of Business and Rural Development Management, Vytautas Magnus University, 53361 Kaunas, Lithuania
Keywords: crop damage; wild boar; Sus scrofa; maize; UAV; RGB imagery
https://www.mdpi.com/2073-4395/15/1/238
title Detection of Crop Damage in Maize Using Red–Green–Blue Imagery and LiDAR Data Acquired Using an Unmanned Aerial Vehicle
topic crop damage
wild boar
Sus scrofa
maize
UAV
RGB imagery
url https://www.mdpi.com/2073-4395/15/1/238