Dense skip-attention for convolutional networks

Bibliographic Details
Main Authors: Wenjie Liu, Guoqing Wu, Han Wang, Fuji Ren
Format: Article
Language: English
Published: Nature Portfolio 2025-07-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-09346-8
_version_ 1849335223013081088
author Wenjie Liu
Guoqing Wu
Han Wang
Fuji Ren
author_facet Wenjie Liu
Guoqing Wu
Han Wang
Fuji Ren
author_sort Wenjie Liu
collection DOAJ
description Abstract The attention mechanism plays a crucial role in enhancing model performance by guiding the model to focus on important features. However, existing attention methods primarily concentrate on learning attention features within individual modules while ignoring interactions among the network's attention features as a whole. To overcome this limitation, we propose a dense skip-attention method for convolutional networks, a simple but effective approach to boosting performance. Our method establishes dense skip-attention connections that interconnect all attention modules, forcing the model to learn interactive attention features across the network architecture. We conduct extensive experiments on the ImageNet 2012 and Microsoft COCO (MS COCO) 2017 datasets to validate the effectiveness of our approach. The experimental results demonstrate that our method improves the performance of existing attention methods, such as Squeeze-and-Excitation Networks, Efficient Channel Attention Networks, and the Convolutional Block Attention Module, on tasks including image classification, object detection, and instance segmentation. Notably, it achieves these improvements with only a negligible increase in model parameters and computational cost.
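The description above conveys the core idea, dense skip-attention connections that let every attention module interact with all of the others, only in prose. The following is a minimal PyTorch sketch of one possible reading of that idea, not the authors' implementation: each SE-style attention module also receives, through small projection layers, the channel-attention vectors produced by every earlier attention module. The class name DenseSkipSE, the choice of SE-style blocks, the per-connection linear projections, and the reduction ratio are all illustrative assumptions introduced here.

# Minimal sketch (assumption, not the authors' released code): one reading of
# dense skip-attention, where every SE-style attention module also receives the
# channel-attention vectors produced by all earlier attention modules.
import torch
import torch.nn as nn

class DenseSkipSE(nn.Module):
    """SE-style channel attention fused with attention from earlier blocks."""
    def __init__(self, channels, prev_dims, reduction=16):
        super().__init__()
        hidden = max(channels // reduction, 4)
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden), nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
        )
        # One projection per earlier attention vector, mapping it to this
        # block's channel width so it can be added before the sigmoid
        # (the dense skip-attention connection).
        self.proj = nn.ModuleList([nn.Linear(d, channels) for d in prev_dims])

    def forward(self, x, prev_attn):
        s = x.mean(dim=(2, 3))                  # global average pooling -> (N, C)
        a = self.fc(s)                          # this block's own attention logits
        for p, q in zip(self.proj, prev_attn):  # fuse all earlier attention features
            a = a + p(q)
        w = torch.sigmoid(a)
        return x * w[:, :, None, None], a       # reweighted features + raw attention

x1, x2 = torch.randn(2, 64, 32, 32), torch.randn(2, 128, 16, 16)
se1 = DenseSkipSE(64, prev_dims=[])
se2 = DenseSkipSE(128, prev_dims=[64])
y1, a1 = se1(x1, [])
y2, a2 = se2(x2, [a1])                          # second block sees the first block's attention
print(y2.shape)                                 # torch.Size([2, 128, 16, 16])

In this sketch the dense connections add only one small linear layer per earlier attention module, which is consistent with the description's claim that the method adds negligible parameters and computation.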
format Article
id doaj-art-8f60a650967740eea104c4d8a4cc0900
institution Kabale University
issn 2045-2322
language English
publishDate 2025-07-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-8f60a650967740eea104c4d8a4cc0900
2025-08-20T03:45:20Z
eng
Nature Portfolio
Scientific Reports
2045-2322
2025-07-01
Vol 15, Iss 1, Pp 1-9
10.1038/s41598-025-09346-8
Dense skip-attention for convolutional networks
Wenjie Liu (School of Transportation and Civil Engineering, Nantong University)
Guoqing Wu (Nantong Institute of Technology)
Han Wang (School of Transportation and Civil Engineering, Nantong University)
Fuji Ren (University of Electronic Science and Technology of China)
https://doi.org/10.1038/s41598-025-09346-8
spellingShingle Wenjie Liu
Guoqing Wu
Han Wang
Fuji Ren
Dense skip-attention for convolutional networks
Scientific Reports
title Dense skip-attention for convolutional networks
title_full Dense skip-attention for convolutional networks
title_fullStr Dense skip-attention for convolutional networks
title_full_unstemmed Dense skip-attention for convolutional networks
title_short Dense skip-attention for convolutional networks
title_sort dense skip attention for convolutional networks
url https://doi.org/10.1038/s41598-025-09346-8
work_keys_str_mv AT wenjieliu denseskipattentionforconvolutionalnetworks
AT guoqingwu denseskipattentionforconvolutionalnetworks
AT hanwang denseskipattentionforconvolutionalnetworks
AT fujiren denseskipattentionforconvolutionalnetworks