A Recognition Method for Marigold Picking Points Based on the Lightweight SCS-YOLO-Seg Model
| Main Authors: | , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-08-01 |
| Series: | Sensors |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1424-8220/25/15/4820 |
| Summary: | Accurate identification of picking points remains a critical challenge for automated marigold harvesting, primarily due to complex backgrounds and significant pose variations of the flowers. To overcome this challenge, this study proposes SCS-YOLO-Seg, a novel method based on a lightweight segmentation model. The approach enhances the baseline YOLOv8n-seg architecture by replacing its backbone with StarNet and introducing C2f-Star, a novel lightweight feature extraction module. These modifications achieve substantial model compression, significantly reducing the model size, parameter count, and computational complexity (GFLOPs). Segmentation efficiency is further optimized through a dual-path collaborative architecture (Seg-Marigold head). Following mask extraction, picking points are determined by intersecting the optimized elliptical mask fitting results with the stem skeleton. Experimental results demonstrate that SCS-YOLO-Seg effectively balances model compression with segmentation performance. Compared to YOLOv8n-seg, it maintains high accuracy while significantly reducing resource requirements, achieving a picking point identification accuracy of 93.36% with an average inference time of 28.66 ms per image. This work provides a robust and efficient solution for vision systems in automated marigold harvesting. |
| ISSN: | 1424-8220 |
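The summary's final step, locating the picking point by intersecting the fitted elliptical mask with the stem skeleton, can be sketched in simplified form. The helper below is a hypothetical approximation, not the paper's implementation: it stands in the ellipse centre with the flower-mask centroid and the skeleton intersection with a nearest-stem-pixel search, given binary masks from any segmentation model.

```python
import numpy as np

def picking_point(flower_mask: np.ndarray, stem_mask: np.ndarray) -> tuple[int, int]:
    """Approximate a picking point from binary flower and stem masks.

    Simplified stand-in for the paper's ellipse/skeleton intersection:
    use the flower centroid as a proxy for the fitted-ellipse centre,
    then take the stem pixel nearest to it.
    """
    fy, fx = np.nonzero(flower_mask)           # flower pixel coordinates
    cy, cx = fy.mean(), fx.mean()              # centroid (ellipse-centre proxy)
    sy, sx = np.nonzero(stem_mask)             # candidate stem pixels
    d2 = (sy - cy) ** 2 + (sx - cx) ** 2       # squared distance to centroid
    i = int(np.argmin(d2))
    return int(sy[i]), int(sx[i])              # (row, col) picking point
```

In a full pipeline, `flower_mask` and `stem_mask` would come from the segmentation head's per-class masks; a faithful reproduction would fit an ellipse to the flower contour and skeletonize the stem mask before intersecting the two.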