Optimal features assisted multi-attention fusion for robust fire recognition in adverse conditions

Bibliographic Details
Main Authors: Inam Ullah, Nada Alzaben, Yousef Ibrahim Daradkeh, Mi Young Lee
Format: Article
Language: English
Published: Nature Portfolio 2025-07-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-09713-5
Description
Summary: Deep neural networks have significantly enhanced visual data-based fire detection systems. However, high false alarm rates, shallow-layered networks, and poor recognition in challenging environments continue to hinder their practical deployment. To address these limitations, we introduce the Attention-Enhanced Fire Recognition Network (AEFRN), a novel progressive attention-over-attention framework that achieves state-of-the-art (SOTA) performance while maintaining computational efficiency. Our approach introduces three key innovations: first, Convolutional Self-Attention (CSA), which integrates global self-attention with convolution through dynamic kernels and trainable filters for enhanced low-level fire feature processing; second, Recursive Atrous Self-Attention (RASA) with optimized dilation rates, which captures comprehensive multi-scale contextual information through a recursive formulation with minimal parameter overhead; and third, an enhanced Convolutional Block Attention Module (CBAM) with modified channel and spatial attention mechanisms for robust feature discrimination. We validate AEFRN's interpretability using Grad-CAM visualization, demonstrating effective attention focus on fire-relevant regions. Comprehensive experimental evaluation on the FD and BoWFire benchmark datasets shows AEFRN's superiority over SOTA methods, achieving 99.11% accuracy on the FD dataset and 97.98% accuracy on the BoWFire dataset. Extensive comparisons against twelve SOTA approaches confirm AEFRN's effectiveness for fire detection in challenging scenarios while maintaining computational efficiency suitable for practical deployment.
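The abstract's Recursive Atrous Self-Attention (RASA) builds on atrous (dilated) sampling, where raising the dilation rate widens the receptive field without adding weights. The paper's implementation is not given here; as a rough, self-contained illustration of that property only, the following pure-Python sketch of a 1-D atrous convolution (the function name `atrous_conv1d` and its arguments are illustrative, not from the paper) shows the same three-tap kernel covering a span of 3 inputs at rate 1 and 5 inputs at rate 2:

```python
def atrous_conv1d(x, kernel, rate):
    """1-D atrous (dilated) convolution with 'valid' padding.

    The kernel taps are applied to inputs spaced `rate` apart, so the
    effective receptive field is (len(kernel) - 1) * rate + 1 samples
    while the number of weights stays fixed at len(kernel).
    """
    span = (len(kernel) - 1) * rate + 1  # effective receptive field
    return [
        sum(w * x[i + j * rate] for j, w in enumerate(kernel))
        for i in range(len(x) - span + 1)
    ]

x = [float(i) for i in range(10)]       # a toy 1-D signal: 0.0 .. 9.0
k = [1.0, 1.0, 1.0]                     # one shared 3-tap kernel

print(atrous_conv1d(x, k, rate=1))      # span 3: [3.0, 6.0, ..., 24.0]
print(atrous_conv1d(x, k, rate=2))      # span 5, same 3 weights: [6.0, ..., 21.0]
```

Stacking such dilated operators with increasing rates (as RASA's recursive formulation does across scales) lets a network aggregate multi-scale context at minimal parameter cost.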
ISSN:2045-2322