Mobile Robot Navigation with Enhanced 2D Mapping and Multi-Sensor Fusion

This paper presents an enhanced Simultaneous Localization and Mapping (SLAM) framework for mobile robot navigation. It integrates RGB-D cameras and 2D LiDAR sensors to improve both mapping accuracy and localization efficiency. We propose a data fusion strategy in which RGB-D point clouds are projected into 2D and denoised alongside LiDAR data; late fusion then combines the processed data for use in the SLAM system. We also propose an enhanced Gmapping (EGM) algorithm that adds adaptive resampling and degeneracy handling to address particle-depletion issues, thereby improving the robustness of localization. The system is evaluated through simulations and a small-scale real-world implementation using a Tiago robot. In simulation, the system (using EGM) was tested in environments of varying complexity and compared against state-of-the-art methods such as RTAB-Map SLAM, yielding an 8% reduction in traveled distance, a 13% reduction in processing time, and a 15% improvement in goal completion. In small-scale real-world tests, EGM showed slight improvements over the classical GM method: a 3% reduction in traveled distance and a 9% decrease in execution time.
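
As a rough illustration of the late-fusion strategy described in the abstract (a sketch under assumed parameters, not the authors' implementation), the Python snippet below projects an RGB-D point cloud onto the laser plane, drops points outside a height band as a simple denoising step, and fuses the result with a 2D LiDAR scan by keeping the nearest return in each angular bin. The bin count, height band, and range limit are illustrative assumptions.

import numpy as np

def pointcloud_to_scan(points, n_bins=360, z_min=0.05, z_max=1.5, max_range=10.0):
    # Project 3D points (N x 3, sensor frame) onto the 2D laser plane.
    # Points outside the height band are discarded as a simple denoising step.
    pts = points[(points[:, 2] > z_min) & (points[:, 2] < z_max)]
    ranges = np.full(n_bins, np.inf)
    if pts.size == 0:
        return ranges
    r = np.hypot(pts[:, 0], pts[:, 1])
    theta = np.arctan2(pts[:, 1], pts[:, 0])
    valid = r < max_range
    bins = ((theta[valid] + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    np.minimum.at(ranges, bins, r[valid])   # nearest obstacle per angular bin
    return ranges

def late_fuse(lidar_ranges, camera_ranges):
    # Late fusion of two 2D scans: per bin, trust whichever sensor reports the closer obstacle.
    return np.minimum(lidar_ranges, camera_ranges)

# Example with synthetic data (real input would come from the RGB-D and LiDAR drivers).
cloud = np.random.uniform(-5.0, 5.0, size=(1000, 3))
lidar_scan = np.random.uniform(0.5, 10.0, size=360)
fused_scan = late_fuse(lidar_scan, pointcloud_to_scan(cloud))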

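The enhanced Gmapping (EGM) algorithm is described as adding adaptive resampling and degeneracy handling to counter particle depletion. The minimal sketch below shows the standard effective-sample-size criterion used for adaptive resampling in Gmapping-style particle filters; the threshold ratio and the systematic-resampling details are assumptions for illustration, not the paper's EGM code.

import numpy as np

def effective_sample_size(weights):
    # N_eff = 1 / sum(w_i^2) for normalized weights; a low N_eff signals degeneracy.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def maybe_resample(particles, weights, threshold_ratio=0.5, rng=None):
    # Resample only when N_eff falls below a fraction of the particle count,
    # which limits particle depletion compared with resampling at every step.
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    if effective_sample_size(w) >= threshold_ratio * n:
        return particles, w                        # weights still informative: keep the set
    positions = (rng.random() + np.arange(n)) / n  # systematic (low-variance) resampling
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return [particles[i] for i in idx], np.full(n, 1.0 / n)
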
Bibliographic Details
Main Authors: Basheer Al-Tawil, Adem Candemir, Magnus Jung, Ayoub Al-Hamadi
Affiliation: Neuro-Information Technology, Otto-von-Guericke-University Magdeburg, 39106 Magdeburg, Germany
Format: Article
Language: English
Published: MDPI AG, 2025-04-01
Series: Sensors, Vol. 25, No. 8, Article 2408
ISSN: 1424-8220
DOI: 10.3390/s25082408
Subjects: SLAM; localization; gmapping algorithm; navigation; data fusion; point cloud
Online Access: https://www.mdpi.com/1424-8220/25/8/2408