Binocular stereo vision-based relative positioning algorithm for drone swarm

Bibliographic Details
Main Authors: Qing Cheng, Yazhe Wang
Format: Article
Language: English
Published: Nature Portfolio 2025-01-01
Series: Scientific Reports
Subjects: Unmanned aerial vehicle detection; Lightweight network; Binocular stereo vision; Deep learning
Online Access: https://doi.org/10.1038/s41598-025-86981-1
_version_ 1832571723892916224
author Qing Cheng
Yazhe Wang
author_facet Qing Cheng
Yazhe Wang
author_sort Qing Cheng
collection DOAJ
description Abstract To address the challenges of high computational complexity and poor real-time performance in binocular vision-based Unmanned Aerial Vehicle (UAV) formation flight, this paper introduces a UAV localization algorithm based on a lightweight object detection model. Firstly, we optimized the YOLOv5s model using lightweight design principles, resulting in Yolo-SGN. This model achieves a 65.5% reduction in parameter count, a 62.7% reduction in FLOPs, and a 1.8% increase in accuracy compared to the original detection model. Subsequently, Yolo-SGN is utilized to extract target regions from binocular images, and feature point matching is exclusively conducted within these regions to minimize unnecessary computations in non-target areas. Experimental results demonstrate that the combination of Yolo-SGN and the Oriented FAST and Rotated BRIEF (ORB) algorithm reduces feature point matching computations to only a quarter of those in the original ORB algorithm, significantly enhancing real-time performance. Finally, the extracted feature points from UAVs are input into a binocular vision localization model to compute their three-dimensional coordinates. The average of the three-dimensional coordinates of all feature points is used to determine the three-dimensional position of the target UAV. Experimental results confirm that the UAV binocular vision localization algorithm, based on a lightweight object detection model, exhibits exceptional performance in terms of precision and real-time capabilities.
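The final localization step described above (triangulating matched feature points with the binocular model, then averaging their three-dimensional coordinates) can be sketched with the standard pinhole stereo equations. This is a minimal illustration, not the paper's implementation; the focal length, baseline, and principal point below are assumed values for a rectified camera pair.

```python
# Sketch of the localization step from the abstract: each matched
# feature point pair (u_left, v) / (u_right, v) from rectified stereo
# images is back-projected to a 3-D camera-frame point, and the target
# UAV position is the mean of all such points.
# Camera parameters are illustrative assumptions, not from the paper.

F = 800.0              # focal length in pixels (assumed)
B = 0.10               # stereo baseline in metres (assumed)
CX, CY = 320.0, 240.0  # principal point in pixels (assumed)

def triangulate(u_left, u_right, v):
    """Back-project one matched point pair to camera-frame (X, Y, Z)."""
    disparity = u_left - u_right   # positive for points in front of the rig
    z = F * B / disparity          # depth from disparity
    x = (u_left - CX) * z / F
    y = (v - CY) * z / F
    return x, y, z

def locate_uav(matches):
    """Average the 3-D coordinates of all matched feature points."""
    points = [triangulate(ul, ur, v) for (ul, ur, v) in matches]
    n = len(points)
    return tuple(sum(coord) / n for coord in zip(*points))

# Two matched feature points on the same target, roughly 4 m away:
pos = locate_uav([(420.0, 400.0, 240.0), (430.0, 410.0, 260.0)])
```

Restricting ORB matching to the Yolo-SGN detection boxes only changes which pixel pairs enter `matches`; the triangulation and averaging above are unaffected.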
format Article
id doaj-art-59ba89eadccb43fca305f372917bb8f3
institution Kabale University
issn 2045-2322
language English
publishDate 2025-01-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-59ba89eadccb43fca305f372917bb8f3 | 2025-02-02T12:20:37Z | eng | Nature Portfolio | Scientific Reports | ISSN 2045-2322 | 2025-01-01 | vol. 15, iss. 1 | 10.1038/s41598-025-86981-1 | Binocular stereo vision-based relative positioning algorithm for drone swarm | Qing Cheng; Yazhe Wang (School of Air Traffic Management, Civil Aviation Flight University of China) | Subjects: Unmanned aerial vehicle detection; Lightweight network; Binocular stereo vision; Deep learning | https://doi.org/10.1038/s41598-025-86981-1
spellingShingle Qing Cheng
Yazhe Wang
Binocular stereo vision-based relative positioning algorithm for drone swarm
Scientific Reports
Unmanned aerial vehicle detection
Lightweight network
Binocular stereo vision
Deep learning
title Binocular stereo vision-based relative positioning algorithm for drone swarm
title_full Binocular stereo vision-based relative positioning algorithm for drone swarm
title_fullStr Binocular stereo vision-based relative positioning algorithm for drone swarm
title_full_unstemmed Binocular stereo vision-based relative positioning algorithm for drone swarm
title_short Binocular stereo vision-based relative positioning algorithm for drone swarm
title_sort binocular stereo vision based relative positioning algorithm for drone swarm
topic Unmanned aerial vehicle detection
Lightweight network
Binocular stereo vision
Deep learning
url https://doi.org/10.1038/s41598-025-86981-1
work_keys_str_mv AT qingcheng binocularstereovisionbasedrelativepositioningalgorithmfordroneswarm
AT yazhewang binocularstereovisionbasedrelativepositioningalgorithmfordroneswarm