A lightweight high-frequency mamba network for image super-resolution
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-07-01 |
| Series: | Scientific Reports |
| Subjects: | |
| Online Access: | https://doi.org/10.1038/s41598-025-11663-x |
| Summary: | Many researchers have explored how to better utilize global and local information in single image super-resolution (SISR). Various methods based on convolutional neural network (CNN) and Transformer structures have emerged, but few studies address how to combine these two kinds of information. We study the use of the self-attention mechanism to integrate local and global information, aiming to let the model better balance the weights of the two parts. At the same time, to avoid the heavy computation incurred by Transformers, we use the selective state space model VMamba to extract global information, which reduces computational complexity and keeps the network lightweight. Building on this, we propose a High-frequency Mamba Network (HFMN) for SISR, which comprises a local high-frequency extraction module, the Local High-Frequency Feature Block (LHFB); a VMamba-based global feature extraction module, the Mamba-Based Attention Block (MAB); and a dual attention fusion module, the Dual-information Interactive Attention Block (DIAB). The network better incorporates local and global information and has linear complexity in the global feature extraction branch. Experiments on multiple benchmark datasets demonstrate that the network outperforms recent SOTA methods in SISR while using fewer parameters. All code is available at https://github.com/taoWuuu/HFMN (an illustrative sketch of the dual-branch design follows this record). |
|---|---|
| ISSN: | 2045-2322 |
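
The abstract describes a dual-branch design: a convolutional local branch that extracts high-frequency detail (LHFB), a VMamba-based global branch with linear complexity (MAB), and an attention-based fusion of the two (DIAB). The sketch below is a minimal, hypothetical PyTorch rendering of the local-branch and fusion ideas only; the class names, the average-pooling high-pass, and the channel-gating fusion are illustrative assumptions, not the authors' HFMN implementation (see the linked repository for the real code).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalHighFreqBlock(nn.Module):
    """Stand-in for LHFB: isolate high-frequency detail by subtracting a
    low-pass (average-pooled) copy of the features, then refine the
    residual with a small convolutional stack."""

    def __init__(self, channels: int):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.GELU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        low = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1)  # low-pass estimate
        return self.refine(x - low)  # refine the high-frequency residual


class DualAttentionFusion(nn.Module):
    """Stand-in for DIAB: a channel-attention gate decides how much of the
    local high-frequency and global features to keep in the fused output."""

    def __init__(self, channels: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, local_feat: torch.Tensor, global_feat: torch.Tensor) -> torch.Tensor:
        w = self.gate(torch.cat([local_feat, global_feat], dim=1))
        w_local, w_global = w.chunk(2, dim=1)  # per-channel weight for each branch
        return w_local * local_feat + w_global * global_feat


if __name__ == "__main__":
    x = torch.randn(1, 48, 64, 64)  # feature map from a shallow conv stem
    local = LocalHighFreqBlock(48)(x)
    # In HFMN the global branch is a VMamba-based block (MAB); any feature
    # map of matching shape works for this sketch, so we reuse x here.
    fused = DualAttentionFusion(48)(local, x)
    print(fused.shape)  # torch.Size([1, 48, 64, 64])
```

The gating keeps the fusion learnable rather than a fixed sum, which matches the abstract's stated goal of letting the model balance the weights of local and global information; how DIAB actually computes its dual attention is not specified in the record.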