Dense skip-attention for convolutional networks

Abstract: The attention mechanism plays a crucial role in enhancing model performance by guiding the model to focus on important features. However, existing attention methods primarily concentrate on learning attention features within individual modules while ignoring interactions among ove...
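The truncated abstract only gestures at the idea of sharing attention across modules rather than learning it in isolation. As a purely illustrative sketch (not the paper's actual method), the general concept might look like the following, assuming a simple squeeze-and-excitation style channel gate; the function names and the element-wise-mean fusion rule are hypothetical:

```python
import numpy as np

def channel_attention(x):
    """Illustrative channel attention: global average pool
    followed by a sigmoid gate per channel."""
    # x has shape (C, H, W)
    pooled = x.mean(axis=(1, 2))            # (C,)
    return 1.0 / (1.0 + np.exp(-pooled))    # sigmoid gate, (C,)

def dense_skip_attention(features):
    """Hypothetical dense skip-attention: each stage fuses its own
    attention gate with the gates of ALL earlier stages (dense skips)
    before re-weighting its feature map."""
    gates, outputs = [], []
    for x in features:
        g = channel_attention(x)
        # fuse current gate with every previous stage's gate
        fused = np.mean(np.stack(gates + [g]), axis=0)
        gates.append(g)
        outputs.append(x * fused[:, None, None])
    return outputs

rng = np.random.default_rng(0)
feats = [rng.standard_normal((4, 8, 8)) for _ in range(3)]
outs = dense_skip_attention(feats)
```

The contrast with a conventional attention module is that each stage here sees the attention statistics of all preceding stages, rather than computing its gate from its own features alone.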
| Main Authors: | Wenjie Liu, Guoqing Wu, Han Wang, Fuji Ren |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-07-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-025-09346-8 |
Similar Items
- Lightweight Pyramidal Multi-Scale Attention Residual Network for Plant Disease Recognition
  by: Wenjie Liu, et al.
  Published: (2025-01-01)
- Enhanced urban driving scene segmentation using modified UNet with residual convolutions and attention guided skip connections
  by: Siddhant Arora, et al.
  Published: (2025-08-01)
- DFF-ResNet: An Insect Pest Recognition Model Based on Residual Networks
  by: Wenjie Liu, et al.
  Published: (2020-12-01)
- Multi-convolutional neural network brain image denoising study based on feature distillation learning and dense residual attention
  by: Huimin Qu, et al.
  Published: (2025-03-01)
- A dense multi-pooling convolutional network for driving fatigue detection
  by: Qing Han, et al.
  Published: (2025-05-01)