Efficient Attention Transformer Network With Self-Similarity Feature Enhancement for Hyperspectral Image Classification
Recently, transformers have gained widespread application in hyperspectral image classification (HSIC) tasks due to their powerful global modeling ability. However, the inherent high-dimensional property of hyperspectral images (HSIs) leads to a sharp increase in the number of parameters and expensive c...
Saved in:
| Main Authors: | Yuyang Wang, Zhenqiu Shu, Zhengtao Yu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing |
| Subjects: | Attention; hyperspectral image classification (HSIC); self-similarity; spectral interactive; transformer |
| Online Access: | https://ieeexplore.ieee.org/document/10964176/ |
| author | Yuyang Wang, Zhenqiu Shu, Zhengtao Yu |
|---|---|
| collection | DOAJ |
| description | Recently, transformers have gained widespread application in hyperspectral image classification (HSIC) tasks due to their powerful global modeling ability. However, the inherent high-dimensional property of hyperspectral images (HSIs) leads to a sharp increase in the number of parameters and expensive computational costs. Moreover, self-attention operations in transformer-based HSIC methods may introduce irrelevant spectral–spatial information, which can degrade classification performance. To mitigate these issues, in this article, we introduce an efficient deep network, named the efficient attention transformer network (EATN), for practical HSIC tasks. Specifically, we propose two self-similarity descriptors based on the original HSI patch to enhance spatial feature representations. The center self-similarity descriptor emphasizes pixels similar to the central pixel, whereas the neighborhood self-similarity descriptor captures the similarity between each pixel and its neighboring pixels within the patch. We then embed these two self-similarity descriptors into the original patch for subsequent feature extraction and classification. Furthermore, we design two efficient feature extraction modules based on the preprocessed patches, called the spectral interactive transformer module and the spatial conv-attention module, to reduce the computational costs of the classification framework. Extensive experiments on four benchmark datasets show that the proposed EATN method outperforms other state-of-the-art HSI classification approaches. |
| format | Article |
| id | doaj-art-3003f71cf2bc4ecb94eedf294deb2d6e |
| institution | DOAJ |
| issn | 1939-1404 2151-1535 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing |
| spelling | Record doaj-art-3003f71cf2bc4ecb94eedf294deb2d6e, indexed 2025-08-20T03:10:10Z. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 18, pp. 11469–11486, 2025-01-01. DOI: 10.1109/JSTARS.2025.3560384; IEEE article 10964176. Authors: Yuyang Wang (https://orcid.org/0009-0002-1527-4517), Zhenqiu Shu (https://orcid.org/0000-0001-5737-3383), and Zhengtao Yu (https://orcid.org/0000-0001-8952-8984), all with the Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, China. |
| title | Efficient Attention Transformer Network With Self-Similarity Feature Enhancement for Hyperspectral Image Classification |
| topic | Attention; hyperspectral image classification (HSIC); self-similarity; spectral interactive; transformer |
| url | https://ieeexplore.ieee.org/document/10964176/ |
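As a rough illustration of the two self-similarity descriptors summarized in the abstract, the sketch below computes a center self-similarity map and a neighborhood self-similarity map for an HSI patch and stacks them onto the patch as extra channels. Cosine similarity, the 8-neighborhood, edge padding, and all function names are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def center_self_similarity(patch):
    """Cosine similarity of every pixel's spectrum to the center pixel's.

    patch: (H, W, B) array of spectra; returns an (H, W) similarity map.
    """
    H, W, B = patch.shape
    center = patch[H // 2, W // 2]                          # center spectrum, (B,)
    flat = patch.reshape(-1, B)
    num = flat @ center
    den = np.linalg.norm(flat, axis=1) * np.linalg.norm(center) + 1e-12
    return (num / den).reshape(H, W)

def neighborhood_self_similarity(patch):
    """Mean cosine similarity of each pixel to its 8 spatial neighbors.

    Edge padding repeats border pixels, so border values are slightly
    optimistic; this is just one simple boundary choice.
    """
    H, W, B = patch.shape
    unit = patch / (np.linalg.norm(patch, axis=-1, keepdims=True) + 1e-12)
    padded = np.pad(unit, ((1, 1), (1, 1), (0, 0)), mode="edge")
    sims = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue                                    # skip the pixel itself
            shifted = padded[1 + dy:1 + dy + H, 1 + dx:1 + dx + W]
            sims.append(np.sum(unit * shifted, axis=-1))    # cosine of unit vectors
    return np.mean(sims, axis=0)

def enhance_patch(patch):
    """Embed both descriptor maps into the patch as two extra channels."""
    c = center_self_similarity(patch)[..., None]
    n = neighborhood_self_similarity(patch)[..., None]
    return np.concatenate([patch, c, n], axis=-1)           # (H, W, B + 2)
```

The enhanced patch would then feed the downstream spectral and spatial feature-extraction modules in place of the raw patch.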