Robust Multi-Input Multi-Output Analysis for Crop Row Segmentation and Furrow Line Detection in Diverse Agricultural Fields
| Main Authors: | , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11058970/ |
| Summary: | Agricultural robots have transformed traditional crop cultivation and harvesting by addressing labor shortages and enhancing precision and quality through artificial intelligence. Precise detection of furrow centerlines is critical for the seamless navigation of these robots. Previous methods often focused on specific furrow types, leading to issues with generalization and adaptability. This paper introduces a comprehensive deep-learning model designed to detect furrow centerlines across diverse types of furrows, thereby improving accuracy and robustness. Our algorithm utilizes RGB and depth images, processed through a dual encoder that combines their features. These features are refined through a channel-limiting network and then enhanced by Deep Multi-scale Feature Fusion (DFF), which maintains feature correlations across different scales. Finally, two interlinked decoders re-utilize the multiscale features to compute segmentations and lines. Our model significantly outperforms state-of-the-art methods, achieving a minimal lateral distance deviation of just 7.8 pixels, well within the acceptable range for agricultural robotics, and a detection line ratio (*mLR*) of 71.13%. Additionally, under a multi-task learning setup, our approach yields over a 10% improvement in *mIOU* for the furrow segmentation task. Our results demonstrate that the proposed model is robust and adaptable to various environments and conditions, ensuring reliable furrow navigation for agricultural robots. |
| ISSN: | 2169-3536 |
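The summary reports performance as a lateral distance deviation in pixels between the predicted and reference furrow centerlines. The paper's exact evaluation protocol is not given in this record, but a common way to compute such a metric is to compare the two centerlines row by row and average the absolute horizontal offsets. The sketch below illustrates that idea; the function name and list-of-columns representation are assumptions for illustration, not the authors' implementation.

```python
def mean_lateral_deviation(pred_cols, gt_cols):
    """Mean absolute horizontal (pixel) offset between two centerlines.

    pred_cols, gt_cols: x-coordinates of the predicted and ground-truth
    centerline, one value per image row (equal length, non-empty).
    """
    if len(pred_cols) != len(gt_cols) or not pred_cols:
        raise ValueError("centerlines must be non-empty and of equal length")
    # Average the per-row absolute horizontal offsets.
    return sum(abs(p - g) for p, g in zip(pred_cols, gt_cols)) / len(pred_cols)

# Example: a prediction that drifts a few pixels from the reference line.
gt = [100, 101, 102, 103, 104]
pred = [102, 100, 105, 103, 108]
print(mean_lateral_deviation(pred, gt))  # offsets 2,1,3,0,4 -> mean 2.0
```

Under a metric like this, the reported 7.8-pixel deviation would mean the predicted centerline stays, on average, within about eight pixels of the reference line across the image.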