A Comparison of Segmentation Methods for Semantic OctoMap Generation
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-06-01 |
| Series: | Applied Sciences |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2076-3417/15/13/7285 |
| Summary: | Semantic mapping plays a critical role in enabling autonomous vehicles to understand and navigate complex environments. Instead of computationally demanding 3D segmentation of point clouds, we propose efficient segmentation on RGB images and projection of the corresponding LiDAR measurements onto the semantic OctoMap. This study presents a comparative evaluation of different semantic segmentation methods and examines the impact of input image resolution on the accuracy of 3D semantic environment reconstruction, inference time, and computational resource usage. The experiments were conducted using a ROS 2-based pipeline that combines RGB images and LiDAR point clouds. Semantic segmentation is performed using ONNX-exported deep neural networks, with class predictions projected onto the corresponding 3D LiDAR data using calibrated extrinsics. The resulting semantically annotated point clouds are fused into a probabilistic 3D representation using an OctoMap, where each voxel stores both occupancy and semantic class information. Multiple encoder–decoder architectures with various backbone configurations are evaluated in terms of segmentation quality, latency, memory footprint, and GPU utilization. Furthermore, a comparison between high and low image resolutions is conducted to assess the trade-offs between model accuracy and real-time applicability. |
|---|---|
| ISSN: | 2076-3417 |
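The summary describes projecting 2D class predictions onto LiDAR points using calibrated extrinsics. The record does not include the authors' code; the following is a minimal illustrative sketch of that projection step, assuming a standard pinhole camera model with intrinsic matrix `K` and a LiDAR-to-camera rigid transform `(R, t)` — all function and variable names here are hypothetical, not from the article.

```python
import numpy as np

def label_lidar_points(points_lidar, seg_mask, K, R, t):
    """Assign 2D segmentation labels to 3D LiDAR points (illustrative sketch).

    points_lidar : (N, 3) points in the LiDAR frame
    seg_mask     : (H, W) integer class IDs from the image segmentation
    K            : (3, 3) camera intrinsic matrix
    R, t         : rotation (3, 3) and translation (3,), LiDAR -> camera
    Returns the (M, 3) points that project into the image and their (M,) labels.
    """
    H, W = seg_mask.shape
    # Transform points into the camera frame using the calibrated extrinsics.
    pts_cam = points_lidar @ R.T + t
    in_front = pts_cam[:, 2] > 0          # keep points in front of the camera
    pts_cam = pts_cam[in_front]
    # Perspective projection onto the image plane: uv = K * X / Z.
    uvw = pts_cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    # Discard projections that fall outside the image bounds.
    valid = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    labels = seg_mask[v[valid], u[valid]]
    return points_lidar[in_front][valid], labels
```

In a ROS 2 pipeline this lookup would run on time-synchronized image/point-cloud pairs; distortion correction and occlusion handling are omitted here for brevity.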
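The summary also states that each OctoMap voxel stores both occupancy and semantic class information. The actual system presumably uses the C++ `octomap` library; as a rough conceptual sketch only, a flat voxel grid (no octree, no free-space ray casting) that keeps a log-odds occupancy value and a per-class vote histogram per voxel could look like this — class names, parameters, and the 0.85 log-odds hit update are assumptions, not values from the article:

```python
import math
from collections import defaultdict

class SemanticVoxelGrid:
    """Conceptual stand-in for semantic OctoMap leaves: each voxel holds a
    log-odds occupancy estimate plus a histogram of observed class labels."""

    def __init__(self, resolution=0.2, hit_logodds=0.85):
        self.resolution = resolution
        self.hit = hit_logodds
        self.occupancy = defaultdict(float)                     # voxel key -> log-odds
        self.class_counts = defaultdict(lambda: defaultdict(int))

    def _key(self, point):
        # Discretize a metric point to its voxel index.
        return tuple(int(math.floor(c / self.resolution)) for c in point)

    def integrate(self, point, class_id):
        """Fuse one labeled endpoint: raise occupancy, record the class vote."""
        k = self._key(point)
        self.occupancy[k] += self.hit
        self.class_counts[k][class_id] += 1

    def query(self, point):
        """Return (occupied?, majority class) for the voxel containing point."""
        k = self._key(point)
        if self.occupancy[k] <= 0.0:
            return False, None
        counts = self.class_counts[k]
        return True, max(counts, key=counts.get)
```

Taking the per-voxel majority class makes the semantic estimate robust to occasional mislabeled pixels, while the log-odds update mirrors standard probabilistic occupancy mapping.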