Robust Low-Snapshot DOA Estimation for Sparse Arrays via a Hybrid Convolutional Graph Neural Network

Bibliographic Details
Main Authors: Hongliang Zhu, Hongxi Zhao, Chunshan Bao, Yiran Shi, Wenchao He
Format: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Sensors
Subjects:
Online Access:https://www.mdpi.com/1424-8220/25/15/4563
Description
Summary: We propose a hybrid Convolutional Graph Neural Network (C-GNN) for direction-of-arrival (DOA) estimation in sparse sensor arrays under low-snapshot conditions. The C-GNN architecture combines 1D convolutional layers for local spatial feature extraction with graph convolutional layers for global structural learning, effectively capturing both fine-grained and long-range array dependencies. Leveraging the difference coarray technique, the sparse array is transformed into a virtual uniform linear array (VULA) to enrich the spatial sampling; real-valued covariance matrices derived from the array measurements are used as the network’s input features. A final multi-layer perceptron (MLP) regression module then maps the learned representations to continuous DOA angle estimates. This approach capitalizes on the increased degrees of freedom offered by the virtual array while inherently incorporating the array’s geometric relationships via graph-based learning. The proposed C-GNN demonstrates robust performance in noisy, low-data scenarios, reliably estimating source angles even with very limited snapshots. By focusing on methodological innovation rather than bespoke architectural tuning, the framework shows promise for data-efficient DOA estimation in challenging practical conditions.
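The feature-construction pipeline the summary describes (difference coarray of a sparse array, virtual ULA covariance, real/imaginary parts as real-valued network input) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sensor positions, source angles, noise level, and snapshot count below are all hypothetical, and the VULA covariance is built by the standard lag-averaging construction rather than whatever variant the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse geometry (positions in half-wavelength units) and scenario
pos = np.array([0, 1, 2, 3, 7, 11])      # illustrative sparse array, hole-free coarray
true_doas = np.deg2rad([-20.0, 15.0])    # two assumed sources
snapshots = 10                           # low-snapshot regime

# Simulate narrowband measurements X = A S + N
A = np.exp(1j * np.pi * np.outer(pos, np.sin(true_doas)))
S = (rng.standard_normal((2, snapshots))
     + 1j * rng.standard_normal((2, snapshots))) / np.sqrt(2)
N = 0.1 * (rng.standard_normal((len(pos), snapshots))
           + 1j * rng.standard_normal((len(pos), snapshots))) / np.sqrt(2)
X = A @ S + N

# Sample covariance from the limited snapshots
R = X @ X.conj().T / snapshots

# Difference coarray: average covariance entries sharing the same lag pos[i] - pos[j]
lags = pos[:, None] - pos[None, :]
max_lag = int(lags.max())
virtual = np.zeros(2 * max_lag + 1, dtype=complex)
for lag in range(-max_lag, max_lag + 1):
    mask = lags == lag
    if mask.any():
        virtual[lag + max_lag] = R[mask].mean()

# Virtual ULA covariance: Hermitian Toeplitz matrix with Rv[i, j] = z(i - j),
# then stack real and imaginary parts as the real-valued input features
m = max_lag + 1
Rv = np.array([virtual[max_lag + i - np.arange(m)] for i in range(m)])
features = np.stack([Rv.real, Rv.imag])  # shape (2, m, m), ready for a conv front end
print(features.shape)
```

With this geometry the coarray covers every lag from 0 to 11, so the virtual ULA has 12 elements and the feature tensor has shape (2, 12, 12); in the paper's framework such a tensor would feed the convolutional layers, with the graph layers operating on the (virtual) array's adjacency structure.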
ISSN: 1424-8220