UAVs-Based Visual Localization via Attention-Driven Image Registration Across Varying Texture Levels

Bibliographic Details
Main Authors: Yan Ren, Guohai Dong, Tianbo Zhang, Meng Zhang, Xinyu Chen, Mingliang Xue
Format: Article
Language: English
Published: MDPI AG, 2024-12-01
Series: Drones
Online Access:https://www.mdpi.com/2504-446X/8/12/739
Description
Summary: This study addresses the difficulties that variations in perspective, lighting, and ground object detail between drone-captured images and satellite imagery pose for image registration, and proposes an image registration and drone visual localization algorithm based on an attention mechanism. First, an improved Oriented FAST and Rotated BRIEF (ORB) algorithm incorporating a quadtree-based feature point homogenization method is designed to extract image feature points, supporting the initial motion estimation of the UAV. Next, a convolutional neural network with an attention mechanism is combined with the inverse compositional Lucas-Kanade method to extract deeper image features, enabling efficient registration of drone images against satellite tiles. Finally, the registration results are used to correct the drone's initial motion estimate and accurately determine its location. Experiments show that the proposed algorithm achieves an average absolute positioning error of under 40 m on low-texture flight paths and under 10 m on high-texture paths, significantly mitigating the positioning errors that arise from inconsistencies between drone images and satellite maps, while also running notably faster than existing algorithms.
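To make the feature-extraction step concrete, below is a minimal sketch of quadtree-based keypoint homogenization on top of OpenCV's ORB detector, in the spirit of the improved ORB front end described in the abstract. The splitting strategy, the target keypoint count, and the input filename are assumptions for illustration; the paper's exact scheme may differ.

```python
# Sketch: quadtree-based homogenization of ORB keypoints (assumed scheme,
# not the paper's exact implementation).
import cv2

def homogenize_keypoints(keypoints, width, height, target_count):
    """Spread keypoints evenly by recursively splitting the image into
    quadrants and keeping only the strongest keypoint per leaf node."""
    # Each node: (x0, y0, x1, y1, [keypoints inside the node]).
    nodes = [(0.0, 0.0, float(width), float(height), list(keypoints))]
    while len(nodes) < target_count:
        # Split the node that currently holds the most keypoints.
        nodes.sort(key=lambda n: len(n[4]), reverse=True)
        x0, y0, x1, y1, kps = nodes[0]
        # Stop when every node holds at most one keypoint, or the node is
        # too small to split (guards against coincident keypoints).
        if len(kps) <= 1 or ((x1 - x0) < 2 and (y1 - y0) < 2):
            break
        nodes = nodes[1:]
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        for qx0, qy0, qx1, qy1 in ((x0, y0, mx, my), (mx, y0, x1, my),
                                   (x0, my, mx, y1), (mx, my, x1, y1)):
            inside = [k for k in kps
                      if qx0 <= k.pt[0] < qx1 and qy0 <= k.pt[1] < qy1]
            if inside:
                nodes.append((qx0, qy0, qx1, qy1, inside))
    # Keep the highest-response keypoint in each remaining node.
    return [max(n[4], key=lambda k: k.response) for n in nodes if n[4]]

# Hypothetical usage on a single drone frame.
img = cv2.imread("drone_frame.png", cv2.IMREAD_GRAYSCALE)
orb = cv2.ORB_create(nfeatures=2000)
kps = orb.detect(img, None)
kps = homogenize_keypoints(kps, img.shape[1], img.shape[0], target_count=500)
kps, descs = orb.compute(img, kps)  # descriptors for the homogenized set
```

Keeping only the strongest-response keypoint per quadtree leaf forces features into regions that raw ORB would leave sparse, which is what makes this kind of homogenization relevant to the low-texture flight paths evaluated in the paper.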
ISSN:2504-446X