Multi-representation domain attentive contrastive learning based unsupervised automatic modulation recognition
| Main Authors: | , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-07-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-025-60921-z |
| Summary: | Abstract The rapid advancement of B5G/6G and wireless technologies, combined with rising end-user numbers, has intensified radio spectrum congestion. Automatic modulation recognition, crucial for spectrum sensing in cognitive radio, traditionally relies on supervised methods requiring extensive labeled data. However, acquiring reliable labels is challenging. Here, we propose an unsupervised framework, Multi-Representation Domain Attentive Contrastive Learning, which extracts high-quality signal features from unlabeled data via cross-domain contrastive learning. Inter-domain and intra-domain contrastive mechanisms enhance mutual modulation feature extraction across domains while preserving each source domain's self-information. The domain attention module dynamically selects representation domains at the feature level, improving adaptability. Experiments on public datasets show that the proposed method outperforms existing modulation recognition methods and can be extended to accommodate various representation domains. This study bridges the gap between unsupervised and supervised learning for radio signals, advancing Internet of Things and cognitive radio development. |
| ISSN: | 2041-1723 |
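
The summary above describes cross-domain contrastive learning over multiple signal representation domains together with a domain attention module. The Python sketch below is only an illustration of those ideas, not the authors' implementation: the encoder layout, the choice of raw I/Q and an FFT spectrum as the two domains, the InfoNCE-style inter-domain loss, and the `DomainAttention` fusion module are all assumptions made for the example.

```python
# Illustrative sketch (assumed design, not the paper's code): two representation
# domains of the same unlabeled radio signal are encoded separately, matched
# across domains with an InfoNCE-style contrastive loss, and fused with a small
# attention over domain features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """1-D conv encoder mapping a (batch, 2, length) signal to a unit embedding."""
    def __init__(self, in_ch=2, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, 64, 7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(64, 128, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(128, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)


def info_nce(z_a, z_b, temperature=0.1):
    """Inter-domain contrastive loss: the same sample in both domains is the positive pair."""
    logits = z_a @ z_b.t() / temperature
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)


class DomainAttention(nn.Module):
    """Softmax-weighted fusion of per-domain embeddings at the feature level."""
    def __init__(self, dim=128):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, domain_feats):            # list of (batch, dim) tensors
        stacked = torch.stack(domain_feats, 1)  # (batch, n_domains, dim)
        weights = torch.softmax(self.score(stacked), dim=1)
        return (weights * stacked).sum(dim=1)   # (batch, dim)


if __name__ == "__main__":
    batch, length = 32, 1024
    iq = torch.randn(batch, 2, length)                  # unlabeled I/Q samples
    spec = torch.fft.fft(torch.complex(iq[:, 0], iq[:, 1]))
    spec = torch.stack([spec.real, spec.imag], dim=1)   # frequency-domain view

    enc_iq, enc_spec = Encoder(), Encoder()
    attn = DomainAttention()

    z_iq, z_spec = enc_iq(iq), enc_spec(spec)
    loss = info_nce(z_iq, z_spec)                       # inter-domain contrast
    fused = attn([z_iq, z_spec])                        # domain-attentive fusion
    print(loss.item(), fused.shape)
```

In this sketch the attention weights decide how much each representation domain contributes to the fused feature, which is one plausible reading of the "dynamically selects representation domains at the feature level" claim in the abstract; an intra-domain contrastive term over augmented views of the same domain could be added with the same `info_nce` function.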