Graph-Based Few-Shot Learning for Synthetic Aperture Radar Automatic Target Recognition with Alternating Direction Method of Multipliers
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-03-01 |
| Series: | Remote Sensing |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2072-4292/17/7/1179 |
| Summary: | Synthetic aperture radar (SAR) automatic target recognition (ATR) underpins various remote sensing tasks, such as defense surveillance, environmental monitoring, and disaster management. However, the scarcity of annotated SAR data significantly limits the performance of conventional data-driven methods. To address this challenge, we propose a novel few-shot learning (FSL) framework: the alternating direction method of multipliers–graph convolutional network (ADMM-GCN) framework. ADMM-GCN integrates a GCN with ADMM to enhance SAR ATR under limited data conditions, effectively capturing both global and local structural information from SAR samples. Additionally, it leverages a mixed regularized loss to mitigate overfitting and employs an ADMM-based optimization strategy to improve training efficiency and model stability. Extensive experiments conducted on the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset demonstrate the superiority of ADMM-GCN, achieving an impressive accuracy of 92.18% on the challenging three-way 10-shot task and outperforming the benchmarks by 3.25%. Beyond SAR ATR, the proposed approach also advances FSL for real-world applications in remote sensing and geospatial analysis, where learning from scarce data is essential. |
| ISSN: | 2072-4292 |
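The abstract describes a graph convolutional network (GCN) as the core of the proposed ADMM-GCN framework. As a point of reference only, a single symmetrically normalized graph convolution layer, the standard building block such a framework would use, can be sketched as follows. This is not the authors' implementation; the toy graph, embedding sizes, and function name are illustrative assumptions.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One GCN propagation step: ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                   # add self-loops
    deg = a_hat.sum(axis=1)                   # node degrees of A + I
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^-1/2
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt  # symmetric normalization
    return np.maximum(a_norm @ features @ weights, 0.0)  # ReLU activation

# Toy few-shot episode: 4 SAR samples as graph nodes with 8-dim embeddings,
# projected to 3 logits (mirroring the three-way task in the abstract).
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
x = rng.standard_normal((4, 8))
w = rng.standard_normal((8, 3))
out = gcn_layer(adj, x, w)
print(out.shape)  # (4, 3)
```

In the paper's setting, stacking such layers lets node (sample) embeddings aggregate information from neighboring labeled samples, which is what allows classification from very few shots; the ADMM component then optimizes the resulting regularized objective.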