Efficient Image Super-Resolution with Multi-Branch Mixer Transformer
Main Authors:
Format: Article
Language: English
Published: Slovenian Society for Stereology and Quantitative Image Analysis, 2025-02-01
Series: Image Analysis and Stereology
Online Access: https://www.ias-iss.org/ojs/IAS/article/view/3399
Summary: Deep learning methods have demonstrated significant advancements in single image super-resolution (SISR), with Transformer-based models frequently outperforming CNN-based counterparts. However, because of the self-attention mechanism in Transformers, building lightweight models remains more challenging than with CNN-based approaches. In this paper, we propose a lightweight Transformer model for SR termed the Multi-Branch Mixer Transformer (MBMT). The design of MBMT is motivated by two main considerations: first, while self-attention excels at capturing long-range dependencies in features, it struggles to extract local features; second, the quadratic complexity of self-attention poses a significant challenge to building lightweight models. To address these problems, we propose a Multi-Branch Token Mixer (MBTM) that extracts richer global and local information. Specifically, MBTM consists of three parts: shifted window attention, depthwise convolution, and an active token mixer. This multi-branch structure handles long-range dependencies and local features simultaneously, enabling excellent SR performance with just a few stacked modules. Experimental results demonstrate that MBMT achieves competitive performance while maintaining model efficiency compared to SOTA methods.
ISSN: 1580-3139, 1854-5165
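
The abstract describes the MBTM block as three parallel branches (shifted window attention, depthwise convolution, and an active token mixer) whose outputs together capture both long-range dependencies and local detail. Below is a minimal PyTorch sketch of that idea, written from the abstract alone and not taken from the paper: the cyclic-shift approximation of shifted window attention, the simplified stand-in for the active token mixer, the concatenate-and-fuse combination, and all hyper-parameters (channel width, window size, head count) are assumptions made here for illustration.

# Hedged sketch of the Multi-Branch Token Mixer (MBTM) described in the abstract.
# All module names, the branch-fusion scheme, and hyper-parameters are assumptions;
# the paper's exact formulation (e.g. its active token mixer) may differ.
import torch
import torch.nn as nn


class WindowAttention(nn.Module):
    """Multi-head self-attention computed inside non-overlapping windows.
    A cyclic shift (torch.roll) approximates the 'shifted window' behaviour."""
    def __init__(self, dim, window_size=8, num_heads=4, shift=False):
        super().__init__()
        self.ws, self.shift = window_size, shift
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):                       # x: (B, C, H, W), H and W divisible by ws
        B, C, H, W = x.shape
        s = self.ws // 2 if self.shift else 0
        if s:
            x = torch.roll(x, shifts=(-s, -s), dims=(2, 3))
        # partition into (ws x ws) windows and flatten each window into tokens
        x = x.view(B, C, H // self.ws, self.ws, W // self.ws, self.ws)
        x = x.permute(0, 2, 4, 3, 5, 1).reshape(-1, self.ws * self.ws, C)
        x, _ = self.attn(x, x, x)               # attention within each window
        # reverse the window partition
        x = x.reshape(B, H // self.ws, W // self.ws, self.ws, self.ws, C)
        x = x.permute(0, 5, 1, 3, 2, 4).reshape(B, C, H, W)
        if s:
            x = torch.roll(x, shifts=(s, s), dims=(2, 3))
        return x


class ActiveTokenMixer(nn.Module):
    """Simplified stand-in for an active token mixer: tokens are mixed along
    horizontal and vertical directions, then recombined by a 1x1 projection."""
    def __init__(self, dim):
        super().__init__()
        self.mix_h = nn.Conv2d(dim, dim, kernel_size=(1, 7), padding=(0, 3), groups=dim)
        self.mix_v = nn.Conv2d(dim, dim, kernel_size=(7, 1), padding=(3, 0), groups=dim)
        self.proj = nn.Conv2d(dim, dim, 1)

    def forward(self, x):
        return self.proj(self.mix_h(x) + self.mix_v(x))


class MBTM(nn.Module):
    """Multi-Branch Token Mixer: shifted-window attention, depthwise convolution,
    and an active-token-mixer branch, fused by a 1x1 convolution (assumed fusion)."""
    def __init__(self, dim=48, window_size=8, num_heads=4, shift=False):
        super().__init__()
        self.norm = nn.GroupNorm(1, dim)        # LayerNorm over channels
        self.attn_branch = WindowAttention(dim, window_size, num_heads, shift)
        self.conv_branch = nn.Conv2d(dim, dim, 3, padding=1, groups=dim)  # depthwise conv
        self.atm_branch = ActiveTokenMixer(dim)
        self.fuse = nn.Conv2d(3 * dim, dim, 1)

    def forward(self, x):
        y = self.norm(x)
        y = torch.cat([self.attn_branch(y), self.conv_branch(y), self.atm_branch(y)], dim=1)
        return x + self.fuse(y)                 # residual connection


if __name__ == "__main__":
    x = torch.randn(1, 48, 64, 64)              # feature map, spatial size divisible by window size
    print(MBTM(dim=48)(x).shape)                # -> torch.Size([1, 48, 64, 64])

In a full SR network such blocks would be stacked a few times over shallow convolutional features and followed by an upsampling reconstruction head; the stacking depth and upsampler are not specified in the abstract.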