Adaptive Covariance Matrix for UAV-Based Visual–Inertial Navigation Systems Using Gaussian Formulas


Bibliographic Details
Main Authors: Yangzi Cong, Wenbin Su, Nan Jiang, Wenpeng Zong, Long Li, Yan Xu, Tianhe Xu, Paipai Wu
Format: Article
Language: English
Published: MDPI AG 2025-08-01
Series: Sensors
Subjects:
Online Access: https://www.mdpi.com/1424-8220/25/15/4745
Description
Summary: In a variety of UAV applications, visual–inertial navigation systems (VINSs) play a crucial role in providing accurate positioning and navigation solutions. However, traditional VINSs struggle to adapt flexibly to varying environmental conditions because their covariance matrix settings are fixed. This limitation becomes especially acute during high-speed drone operations, where motion blur and fluctuating image clarity can significantly compromise navigation accuracy and system robustness. To address these issues, we propose an innovative adaptive covariance matrix estimation method for UAV-based VINS using Gaussian formulas. Our approach enhances the accuracy and robustness of the navigation system by dynamically adjusting the covariance matrix according to image quality. Leveraging the Laplacian operator, detailed assessments of image blur are performed, achieving precise perception of image quality. Based on these assessments, a novel mechanism is introduced that dynamically adjusts the visual covariance matrix through a Gaussian model according to the clarity of images in the current environment. Extensive simulation experiments on the EuRoC and TUM VI datasets, as well as outdoor field tests, validate our method, demonstrating significant improvements in drone navigation accuracy in scenarios with motion blur. Our algorithm achieves significantly higher accuracy than the well-known VINS-Mono framework, outperforming it by 18.18% on average; in the outdoor field tests, the RMS error improvement reaches 65.66% on the F1 dataset and 41.74% on F2.
ISSN:1424-8220
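The blur-adaptive covariance mechanism described in the summary can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it uses the common variance-of-Laplacian blur score, and the specific Gaussian mapping from blur score to covariance scale (and the parameters `sigma0`, `s_ref`, `tau`) are hypothetical placeholders for whatever calibration the paper actually uses.

```python
import numpy as np

# 3x3 Laplacian kernel commonly used for image blur assessment
LAPLACIAN = np.array([[0.0,  1.0, 0.0],
                      [1.0, -4.0, 1.0],
                      [0.0,  1.0, 0.0]])

def laplacian_variance(img):
    """Blur score: variance of the Laplacian response (low = blurry)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    # Convolve by summing shifted copies weighted by the kernel
    for i in range(3):
        for j in range(3):
            out += LAPLACIAN[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out.var()

def visual_covariance(img, sigma0=1.0, s_ref=0.5, tau=0.2):
    """Inflate the baseline visual noise covariance as blur increases.

    sigma0 : baseline pixel-noise std-dev for a sharp image (assumed)
    s_ref  : blur score at/above which the image counts as sharp (assumed)
    tau    : width of the Gaussian down-weighting (assumed)
    """
    s = laplacian_variance(img)
    # Gaussian weight: ~1 for sharp images, shrinking as blur grows,
    # so the resulting covariance grows for blurry frames
    w = np.exp(-max(s_ref - s, 0.0) ** 2 / (2.0 * tau ** 2))
    sigma = sigma0 / max(w, 1e-3)  # cap the inflation to stay finite
    return sigma ** 2 * np.eye(2)  # 2x2 covariance for a pixel measurement
```

A blurry frame then contributes a measurement with a larger covariance, so the estimator automatically trusts the inertial prediction more when image clarity degrades.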