Event/Visual/IMU Integration for UAV-Based Indoor Navigation
Saved in:
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-12-01 |
| Series: | Proceedings |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2504-3900/110/1/2 |
| Summary: | Unmanned aerial vehicle (UAV) navigation in indoor environments is challenging due to varying light conditions, the dynamic clutter typical of indoor spaces, and the absence of GNSS signals. In response to these complexities, emerging sensors such as event cameras show significant potential for indoor navigation thanks to their low latency and high dynamic range. Unlike traditional RGB cameras, event cameras mitigate motion blur and operate effectively in low-light conditions. They output little information when motion is limited, however, whereas standard cameras capture detailed surroundings regardless of motion. This study proposes a novel event-based visual–inertial odometry approach for precise indoor navigation. In the proposed approach, standard images are leveraged for feature detection and tracking, while events are aggregated into frames to track features between consecutive standard frames. Fusing IMU measurements with the feature tracks enables continuous estimation of the sensor states. The proposed approach is evaluated and validated in a controlled office-environment simulation developed in Gazebo, using a simulated P230 drone equipped with an event camera, an RGB camera, and an IMU. This simulated environment provides a testbed for evaluating and showcasing the proposed approach’s robust performance in realistic indoor navigation scenarios. |
|---|---|
| ISSN: | 2504-3900 |
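The summary's central mechanism is aggregating asynchronous events into frame-like images so that a standard feature tracker can follow features between consecutive RGB frames. Below is a minimal sketch of one common way to do this, assuming a simple signed event-count representation; the paper's actual aggregation scheme, function names, and parameters are not specified in this record, so everything here is illustrative.

```python
import numpy as np

def accumulate_events(events, height, width, t_start, t_end):
    """Aggregate events in [t_start, t_end) into a signed-count frame.

    events: array of (t, x, y, polarity) rows, polarity in {-1, +1}.
    Returns an (height, width) float32 frame where each pixel holds the
    signed sum of event polarities -- one simple event-frame
    representation that a conventional tracker (e.g. KLT) can consume.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for t, x, y, p in events:
        if t_start <= t < t_end:  # keep only events between the two RGB frames
            frame[int(y), int(x)] += p
    return frame

# Synthetic events occurring between two hypothetical standard-camera frames.
events = np.array([
    [0.01, 2, 3, +1.0],
    [0.02, 2, 3, +1.0],
    [0.03, 5, 1, -1.0],
    [0.90, 4, 4, +1.0],  # falls outside the aggregation window below
])
frame = accumulate_events(events, height=6, width=8, t_start=0.0, t_end=0.1)
print(frame[3, 2], frame[1, 5])
```

In a full pipeline, such event frames would fill the temporal gap between RGB frames: features detected in the last standard image are tracked through successive event frames, and the resulting tracks are fused with IMU measurements in the state estimator.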