Dense skip-attention for convolutional networks

Bibliographic Details
Main Authors: Wenjie Liu, Guoqing Wu, Han Wang, Fuji Ren
Format: Article
Language: English
Published: Nature Portfolio, 2025-07-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-09346-8
Description
Summary: The attention mechanism plays a crucial role in enhancing model performance by guiding the model to focus on important features. However, existing attention methods primarily concentrate on learning attention features within individual modules while ignoring interactions among the overall attention features. To overcome this limitation, we propose dense skip-attention for convolutional networks, a simple but effective approach to boosting performance. Our method establishes dense skip-attention connections that interconnect all attention modules, forcing the model to learn interactive attention features across the network architecture. We conduct extensive experiments on the ImageNet 2012 and Microsoft COCO (MS COCO) 2017 datasets to validate the effectiveness of our approach. The experimental results demonstrate that our method improves the performance of existing attention methods, such as Squeeze-and-Excitation Networks, Efficient Channel Attention Networks, and the Convolutional Block Attention Module, on tasks including image classification, object detection, and instance segmentation. Notably, it achieves these improvements with a negligible increase in model parameters and computational cost.
ISSN: 2045-2322
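
Illustrative sketch (not from the paper): the abstract describes dense skip-attention connections that let each attention module reuse the attention features of all earlier modules, but it does not give implementation details. The PyTorch code below is a minimal, hypothetical sketch of one way such connections could look, using Squeeze-and-Excitation (SE) channel attention as the per-block attention module. The fusion scheme (linear projection of earlier attention logits followed by summation), the class names (SEAttention, DenseSkipSEBlock), and all hyperparameters are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SEAttention(nn.Module):
    """Standard Squeeze-and-Excitation channel attention; returns pre-sigmoid logits."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        return self.fc(self.pool(x).view(b, c))  # shape (b, c)


class DenseSkipSEBlock(nn.Module):
    """Residual conv block whose SE attention logits are fused with the logits of
    all earlier blocks via dense skip connections (fusion by projection + sum is
    an assumption; the paper's exact scheme is not given in the abstract)."""
    def __init__(self, channels, prev_channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.att = SEAttention(channels)
        # One lightweight projection per earlier attention module (hypothetical).
        self.skips = nn.ModuleList([nn.Linear(c, channels) for c in prev_channels])

    def forward(self, x, prev_logits):
        out = self.conv(x)
        logits = self.att(out)
        # Dense skip-attention: add projected logits from every earlier module.
        for proj, p in zip(self.skips, prev_logits):
            logits = logits + proj(p)
        w = torch.sigmoid(logits).view(out.size(0), -1, 1, 1)
        return out * w + x, logits


# Usage: chain a few blocks, passing the growing list of attention logits forward.
blocks = nn.ModuleList([
    DenseSkipSEBlock(64, prev_channels=[]),
    DenseSkipSEBlock(64, prev_channels=[64]),
    DenseSkipSEBlock(64, prev_channels=[64, 64]),
])
x = torch.randn(2, 64, 32, 32)
logits_so_far = []
for blk in blocks:
    x, logits = blk(x, logits_so_far)
    logits_so_far.append(logits)
print(x.shape)  # torch.Size([2, 64, 32, 32])
```

Because each skip connection only carries a per-channel attention vector through a small linear projection, this kind of interconnection adds few parameters and little computation, which is consistent with the abstract's claim of minimal overhead.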