Framework for smartphone-based grape detection and vineyard management using UAV-trained AI

Viticulture benefits significantly from rapid grape bunch identification and counting, enhancing yield and quality. Recent technological and machine learning advancements, particularly in deep learning, have provided the tools needed to create more efficient, automated processes that significantly reduce the time and effort these tasks require. On one hand, drone, or Unmanned Aerial Vehicle (UAV), imagery combined with deep learning algorithms has revolutionised agriculture by automating plant health classification, disease identification, and fruit detection. However, these advances often remain inaccessible to farmers because they rely on specialised hardware such as ground robots or UAVs. On the other hand, most farmers have access to smartphones. This article proposes a novel approach combining UAV and smartphone technologies. An AI-based framework is introduced, built around a 5-stage pipeline that combines object detection and pixel-level segmentation algorithms to automatically detect grape bunches in smartphone images of a commercial vineyard with vertical trellis training. By leveraging UAV-captured data for training, the proposed model not only accelerates the detection process but also improves the accuracy and adaptability of grape bunch detection across different devices, surpassing the efficiency of traditional and purely UAV-based methods. To this end, using a dataset of UAV videos recorded during early growth stages in July (BBCH77-BBCH79), X-Decoder segments the foreground vegetation in each frame from the background and surroundings. X-Decoder is particularly advantageous because it can be integrated seamlessly into the pipeline without requiring changes to how data are captured, making it more versatile than other methods. YOLO is then trained on these videos and applied to images taken by farmers with common smartphones (Xiaomi Poco X3 Pro and iPhone X). In addition, a web app was developed to connect the system with mobile technology. The proposed approach achieved a precision of 0.92 and a recall of 0.735, with an F1 score of 0.82 and an Average Precision (AP) of 0.802 under different operating conditions, indicating high accuracy and reliability in detecting grape bunches. The AI-detected grape bunches were also compared with ground truth counts, achieving an R² value as high as 0.84, demonstrating the robustness of the system. This study highlights the potential of combining smartphone imaging and web applications, integrating these models into a working platform for farmers and offering a practical, affordable, accessible, and scalable solution. While smartphone-based image collection for model training is labour-intensive and costly, incorporating UAV data accelerates the process, facilitating models that generalise across diverse data sources and platforms. This blend of UAV efficiency and smartphone precision significantly cuts vineyard monitoring time and effort.
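
The reported scores are internally consistent: the F1 value follows from the stated precision and recall as their harmonic mean. A quick check in Python, using only the figures quoted above rather than the underlying data:

```python
# F1 is the harmonic mean of precision and recall; the inputs below are the
# values reported in the abstract, not a re-computation from raw detections.
precision = 0.92
recall = 0.735

f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.2f}")  # prints F1 = 0.82, matching the reported score
```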

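The detection step itself is described as pixel-level segmentation (X-Decoder) followed by YOLO object detection. The sketch below illustrates that two-step idea with the ultralytics YOLO API; the weight file name, the confidence threshold, and the simple colour-based segment_foreground stand-in for X-Decoder are assumptions for illustration, not the authors' configuration.

```python
# Illustrative two-step pipeline: mask foreground vegetation, then detect
# grape bunches with YOLO. The weights file and the segmentation stand-in
# are placeholders; the paper uses X-Decoder for the segmentation stage.
import cv2
import numpy as np
from ultralytics import YOLO


def segment_foreground(frame: np.ndarray) -> np.ndarray:
    """Stand-in for X-Decoder: crude green-dominance mask of the canopy."""
    b = frame[:, :, 0].astype(np.int16)
    g = frame[:, :, 1].astype(np.int16)
    r = frame[:, :, 2].astype(np.int16)
    return ((g > r) & (g > b)).astype(np.uint8) * 255


def detect_bunches(image_path: str, weights: str = "bunch_detector.pt") -> int:
    """Mask the background, run YOLO, and return the number of detected bunches."""
    frame = cv2.imread(image_path)
    if frame is None:
        raise FileNotFoundError(image_path)
    masked = cv2.bitwise_and(frame, frame, mask=segment_foreground(frame))
    model = YOLO(weights)  # hypothetical grape-bunch detection weights
    results = model.predict(masked, conf=0.25, verbose=False)
    return len(results[0].boxes)


if __name__ == "__main__":
    print(detect_bunches("smartphone_photo.jpg"))
```
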
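The R² of 0.84 quoted above comes from comparing AI-detected counts against manual ground truth counts. Assuming paired per-image counts are available, a comparison of that kind can be reproduced as below; the arrays are placeholders, not the study's data.

```python
import numpy as np

# Placeholder per-image counts (illustrative only, not the study's data).
observed = np.array([14, 9, 16, 12, 21, 8])   # manually counted ground truth
detected = np.array([12, 8, 15, 10, 20, 7])   # AI-detected grape bunches

# R^2 of a simple linear fit of detected counts on the ground truth.
slope, intercept = np.polyfit(observed, detected, deg=1)
predicted = slope * observed + intercept
ss_res = np.sum((detected - predicted) ** 2)
ss_tot = np.sum((detected - detected.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.2f}")
```
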

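The abstract also mentions a web app linking the detection system to farmers' smartphones, without naming the framework used. A minimal sketch of such an upload endpoint, written here with Flask as an assumption (the route name and the detect_bunches placeholder are likewise illustrative), could look like:

```python
import tempfile

from flask import Flask, jsonify, request

app = Flask(__name__)


def detect_bunches(image_path: str) -> int:
    """Placeholder: in the full system this would call the segmentation + YOLO
    pipeline sketched above; here it simply returns a dummy count."""
    return 0


@app.post("/detect")
def detect():
    """Accept an uploaded smartphone photo and return the detected bunch count."""
    photo = request.files["image"]
    with tempfile.NamedTemporaryFile(suffix=".jpg") as tmp:
        photo.save(tmp.name)
        count = detect_bunches(tmp.name)
    return jsonify({"grape_bunches": count})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```
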
Bibliographic Details
Main Authors: Sergio Vélez, Mar Ariza-Sentís, Mario Triviño, Antonio Carlos Cob-Parro, Miquel Mila, João Valente
Format: Article
Language: English
Published: Elsevier 2025-02-01
Series: Heliyon
ISSN: 2405-8440
Subjects: YOLO; Yield mapping; Precision agriculture; Vineyard management; Real-time detection; Digital agriculture
Online Access: http://www.sciencedirect.com/science/article/pii/S2405844025009053