Hybrid Vision System for Enhanced Situational Awareness in Unmanned Surface Vehicles: Decision-Level Camera-LiDAR Fusion With Supervised and Unsupervised Approaches
| Main Authors: | , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/10969632/ |
| Summary: | The integration of LiDAR and cameras with an efficient data fusion approach significantly improves Enhanced Situational Awareness (ESA) for small-sized Unmanned Surface Vehicles (USVs). This is critical for early obstacle detection, particularly when maritime obstacles are only 5 to 10 seconds away, and for effective collision avoidance in regions without reliable Global Positioning System (GPS) and Automatic Identification System (AIS) coverage, while maintaining low computational requirements and real-time performance. This paper presents a system for ESA that combines an unsupervised approach with a trained deep learning model to detect multiple maritime obstacles in GPS-denied zones. The proposed system uses compact hardware with an integrated advanced computing module and sensors suited to small-sized USVs, and it addresses the challenge of achieving peak performance on an edge machine-learning computer to reduce computational overhead. It further minimizes temporal detection variation in dynamic maritime environments by applying filters and clustering to synchronized, fused LiDAR and camera data. The detection model, trained on the Maritime Federated Large Dataset (MFLD2), achieved over 99% operational accuracy with the proposed data fusion approach. Experimental findings validate the system's capacity to precisely identify obstacle location and distance, enabling real-time situational awareness. |
| ISSN: | 2169-3536 |
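The abstract describes decision-level fusion in which camera detections supply object labels while clustered LiDAR returns supply range and bearing. A minimal sketch of that idea is shown below; it is illustrative only, under assumed inputs (camera detections given as label plus bearing interval, LiDAR returns as 2-D points in the sensor frame), and the greedy clustering and the `eps` gate distance are stand-ins, not the paper's actual pipeline:

```python
import math

def cluster_points(points, eps=1.5):
    """Greedy Euclidean clustering of 2-D LiDAR points (a simple stand-in
    for the paper's clustering step; eps is an assumed gate distance in m)."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) <= eps for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def centroid(cluster):
    xs, ys = zip(*cluster)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def fuse(camera_dets, lidar_points, eps=1.5):
    """Decision-level fusion sketch: the camera detection contributes the
    label, the matched LiDAR cluster contributes range and bearing.
    camera_dets: list of (label, bearing_min_deg, bearing_max_deg)."""
    fused = []
    for cl in cluster_points(lidar_points, eps):
        cx, cy = centroid(cl)
        rng = math.hypot(cx, cy)              # range to cluster centroid (m)
        brg = math.degrees(math.atan2(cy, cx))  # bearing of centroid (deg)
        for label, bmin, bmax in camera_dets:
            if bmin <= brg <= bmax:           # bearing-gate association
                fused.append({"label": label,
                              "range_m": round(rng, 1),
                              "bearing_deg": round(brg, 1)})
                break
    return fused

# Hypothetical inputs: one camera detection, three LiDAR returns off one hull.
dets = [("boat", 10.0, 30.0)]
pts = [(40.0, 14.0), (40.5, 14.5), (41.0, 14.2)]
print(fuse(dets, pts))
```

Associating in bearing space is one common way to fuse without a full extrinsic calibration; the actual system would also need time synchronization between the two sensors, which this sketch omits.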