A Meta-Learning-Based Recognition Method for Multidimensional Feature Extraction and Fusion of Underwater Targets

Bibliographic Details
Main Authors: Xiaochun Liu, Yunchuan Yang, Youfeng Hu, Xiangfeng Yang, Liwen Liu, Lei Shi, Jianguo Liu
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/10/5744
Description
Summary: To tackle the challenges of relative attitude adaptability and limited sample availability in underwater moving target recognition for active sonar, this study focuses on key aspects such as feature extraction, network model design, and information fusion. A pseudo-three-dimensional spatial feature extraction method is proposed by integrating generalized MUSIC with range–dimension information. The pseudo-WVD time–frequency feature is enhanced through the incorporation of prior knowledge. Additionally, the Doppler frequency shift distribution feature of underwater moving targets is derived and extracted. A multidimensional feature information fusion network model based on meta-learning is developed: meta-knowledge is extracted separately from the spatial, time–frequency, and Doppler feature spectra to improve the generalization capability of the single-feature task networks during small-sample training, and the multidimensional feature information is then fused by a feature fusion classifier. Finally, a sample library is constructed from simulation-enhanced data and experimental data for network training and testing. The results demonstrate that, in few-sample scenarios, the proposed method exploits the complementarity of the multidimensional features, effectively addresses the limited adaptability to the relative horizontal orientation angle in target recognition, and achieves a recognition accuracy of up to 97.1%.
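As an illustration of the time–frequency branch described above, the sketch below computes a discrete pseudo-Wigner–Ville distribution (pseudo-WVD) of an echo signal from its standard definition. This is a minimal sketch, not the paper's implementation: the Hamming lag window, the window length, and the use of a Hilbert transform to form the analytic signal are assumptions, and the prior-knowledge enhancement mentioned in the abstract is not reproduced.

```python
import numpy as np
from scipy.signal import hilbert

def pseudo_wvd(x, fs, win_len=127):
    """Discrete pseudo-Wigner-Ville distribution of a real echo signal.

    x       : 1-D real ndarray (sonar echo samples)
    fs      : sampling rate in Hz
    win_len : odd length of the lag-direction smoothing window (assumed choice)
    Returns (tfr, t, f): a (win_len, len(x)) time-frequency matrix, the time
    axis in seconds, and the frequency axis in Hz (0 .. fs/2).
    """
    z = hilbert(x)                                 # analytic signal
    n = len(z)
    half = win_len // 2
    h = np.hamming(win_len)                        # lag window -> "pseudo" (smoothed) WVD
    acf = np.zeros((win_len, n), dtype=complex)
    for ti in range(n):
        taumax = min(ti, n - 1 - ti, half)
        tau = np.arange(-taumax, taumax + 1)
        # windowed instantaneous autocorrelation z(t + tau) * conj(z(t - tau))
        acf[tau % win_len, ti] = h[half + tau] * z[ti + tau] * np.conj(z[ti - tau])
    tfr = np.real(np.fft.fft(acf, axis=0))         # FFT over the lag axis gives frequency
    t = np.arange(n) / fs
    f = np.arange(win_len) * fs / (2.0 * win_len)  # lag step of 2 samples halves the span
    return tfr, t, f

# Toy usage: a chirp-like narrowband return whose frequency drifts over time
if __name__ == "__main__":
    fs = 8000.0
    t = np.arange(0, 0.2, 1.0 / fs)
    echo = np.cos(2 * np.pi * (1000 + 2000 * t) * t)
    tfr, taxis, faxis = pseudo_wvd(echo, fs)
    # peak frequency of the mid-time column (~1400 Hz for this chirp)
    print(tfr.shape, faxis[np.argmax(tfr[:, len(t) // 2])])
```

Time–frequency maps of this kind, alongside the spatial and Doppler spectra, are what the abstract's per-feature task networks and meta-learning fusion classifier would consume; the fusion network architecture itself is not sketched here.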
ISSN: 2076-3417