Covariance Matrix Reconstruction to Improve DoA Estimation Using Subspace Method in Low SNR Regime
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10836697/ |
| Summary: | Traditional Direction of Arrival (DoA) estimation methods, such as the Multiple Signal Classification algorithm (MUSIC), Root MUSIC (R-MUSIC), and Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT), often suffer significant performance degradation in low Signal-to-Noise Ratio (SNR) environments. To address this challenge, this paper introduces a novel approach utilizing a Residual Network (ResNet) based Convolutional Neural Network (CNN) to enhance DoA estimation performance. Our method reconstructs the covariance matrix using the proposed CNN model, which is trained to map covariance matrices of the actual received signals to the corresponding covariance matrices of the ideal signals. This reconstruction improves the accuracy of the covariance matrix, leading to enhanced DoA estimation that approaches the Cramér-Rao Lower Bound (CRLB), especially at very low SNR levels. Simulation results demonstrate the effectiveness of our approach. At an SNR of −20 dB, the root mean square error (RMSE) for MUSIC improved from 25.16° to 19.67°, for ESPRIT from 29.31° to 17.77°, and for R-MUSIC from 26.28° to 19.81°. The proposed method also shows substantial improvements in performance with varying numbers of snapshots and under different angle-separation conditions. These results highlight the enhanced noise resilience and accuracy of our method. |
| ISSN: | 2169-3536 |
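To make the summary concrete, here is a minimal sketch of the subspace step the paper builds on: standard MUSIC applied to a ULA covariance matrix. The array size, spacing, source angles, and noise power below are illustrative assumptions, and the ideal covariance `R = A Aᴴ + σ²I` stands in for the paper's CNN-reconstructed matrix (the trained ResNet is not reproduced here).

```python
import numpy as np

# Illustrative assumptions (not from the paper): 8-element half-wavelength ULA,
# two unit-power uncorrelated sources, known noise power.
M = 8                       # number of sensors
d = 0.5                     # element spacing in wavelengths
true_doas = [-10.0, 20.0]   # example source angles in degrees
sigma2 = 0.1                # noise power

def steering(theta_deg):
    """ULA steering vector for a source at theta_deg degrees."""
    th = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(th))

# Ideal covariance matrix (the quantity the CNN tries to recover from noisy data)
A = np.stack([steering(t) for t in true_doas], axis=1)   # M x K
R = A @ A.conj().T + sigma2 * np.eye(M)

# MUSIC: noise subspace spanned by eigenvectors of the M-K smallest eigenvalues
K = len(true_doas)
eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
En = eigvecs[:, :M - K]                # noise-subspace basis

# Pseudospectrum: peaks where the steering vector is orthogonal to En
grid = np.arange(-90.0, 90.5, 0.5)
P = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t))**2 for t in grid])

# Pick the K largest local maxima as the DoA estimates
locmax = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1
est = np.sort(grid[locmax[np.argsort(P[locmax])[-K:]]])
print(est)
```

With the ideal covariance, the pseudospectrum peaks fall on the true angles (here, at the grid points −10° and 20°); the paper's contribution is to supply a CNN-reconstructed covariance matrix so that the same subspace step remains accurate when only low-SNR sample covariances are available.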