Weed detection in cabbage fields using RGB and NIR images

Bibliographic Details
Main Authors: Adam Hruška, Pavel Hamouz, Jakub Lev, Josef Pavlíček, Milan Kroulík, Kateřina Hamouzová, Pavlína Košnarová, Josef Holec, Pavel Kouřím
Format: Article
Language: English
Published: Elsevier 2025-12-01
Series: Smart Agricultural Technology
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2772375525004630
Description
Summary: This article evaluates the effectiveness of integrating near-infrared (NIR) data with RGB imaging for weed detection and classification in real-time field settings using the YOLO deep learning model family. Data were gathered from sown weed plots and various locations across Bohemia to document diverse plant phenotypes under different field conditions. A multispectral RGB+NIR camera combined with an LED flashlight system was used for imaging. Besides the cabbage crop, 13 weed classes were classified in the images using various YOLO models. The YOLOv10l model provided the best classification results. Training on RGB+NIR data yielded a mean average precision (mAP@0.5) of 94.9 %, compared with 94.5 % for RGB-only images, underscoring the benefit of NIR in weed detection. When calculated exclusively for the sown species, an mAP@0.5 of 97.8 % was achieved for RGB+NIR data. Adding the NIR images not only increased classification accuracy but also improved semi-automated annotation efficiency, enabling faster dataset preparation. These results suggest that NIR-enhanced YOLOv10l holds potential for precision agriculture, enabling targeted interventions that reduce herbicide use. Future research will focus on expanding model adaptability and accessibility for broader agricultural applications.
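The abstract describes fusing NIR with RGB imagery before training. The article itself does not specify the fusion step here, but a common approach is to stack the co-registered NIR band as a fourth input channel. The sketch below illustrates that assumption with NumPy; the function name and shapes are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_rgb_nir(rgb: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Stack an HxWx3 RGB image and an HxW NIR band into an HxWx4 array.

    Assumes the two images are already co-registered (same height/width),
    as they would be when captured by a combined RGB+NIR camera.
    """
    if rgb.shape[:2] != nir.shape[:2]:
        raise ValueError("RGB and NIR images must share the same height and width")
    # Append NIR as a fourth channel along the last axis.
    return np.concatenate([rgb, nir[..., np.newaxis]], axis=-1)

# Example with synthetic data standing in for one captured frame.
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
nir = np.zeros((480, 640), dtype=np.uint8)
fused = fuse_rgb_nir(rgb, nir)
print(fused.shape)  # (480, 640, 4)
```

A detector trained on such 4-channel arrays needs its first convolutional layer widened to accept the extra band; whether the authors used this or a different fusion scheme is not stated in the record.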
ISSN: 2772-3755