BN-SNN: Spiking neural networks with bistable neurons for object detection.

Bibliographic Details
Main Authors: Siddiqui Muhammad Yasir, Hyun Kim
Format: Article
Language:English
Published: Public Library of Science (PLoS) 2025-01-01
Series:PLoS ONE
Online Access:https://doi.org/10.1371/journal.pone.0327513
Description
Summary:Spiking neural networks (SNNs) are emerging as a promising evolution in neural network paradigms, offering an alternative to conventional convolutional neural networks (CNNs). One of the most effective methods for SNN development is the CNN-to-SNN conversion process. However, existing conversion techniques are hindered by long temporal durations or inference latencies, which negatively impact the accuracy of the converted networks. Additionally, the application of SNNs in object detection tasks remains largely under-explored. In this study, we propose a novel approach utilizing a bistable integrate-and-fire (BIF) neuron model integrated with a single-shot multibox detector (SSD) as the detection head. Leveraging the proposed BIF neuron framework, we convert the widely used ResNet architecture into an SNN. We validate the effectiveness of our approach through object detection tasks on the MS-COCO and Automotive GEN1 datasets. Experimental results show that our conversion technique facilitates object detection with reduced temporal steps and significant enhancements in mean average precision (mAP), achieving mAP@0.5 scores of 0.476 and 0.591 for the MS-COCO and Automotive GEN1 datasets, respectively. This research marks the first application of BIF neurons to object detection, presenting a novel advancement in the field.
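The abstract does not give the exact dynamics of the proposed BIF neuron, but the general idea behind bistable integrate-and-fire models can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the update rule, the two resting points `v_rest_low`/`v_rest_high`, the drift gain, and the soft (subtract-threshold) reset are all assumptions chosen for clarity.

```python
# Hypothetical sketch of a bistable integrate-and-fire (BIF) neuron.
# Unlike a plain IF neuron with a single resting potential, the membrane
# potential here drifts toward the nearer of TWO stable resting states,
# so sub-threshold charge is retained across time steps instead of
# collapsing back to one rest value. All constants are illustrative.

def bif_step(v, input_current, v_th=1.0,
             v_rest_low=0.0, v_rest_high=0.5, bistable_gain=0.1):
    """One discrete time step of a hypothetical BIF neuron.

    v             -- membrane potential carried over from the previous step
    input_current -- weighted synaptic input at this step
    Returns (new_v, spike) where spike is 0 or 1.
    """
    # Bistability: pull v toward whichever resting point is closer.
    midpoint = (v_rest_low + v_rest_high) / 2
    nearest_rest = v_rest_high if v > midpoint else v_rest_low
    v = v + bistable_gain * (nearest_rest - v) + input_current

    if v >= v_th:
        # Soft reset (subtract threshold), a common choice in
        # CNN-to-SNN conversion because it preserves residual charge.
        return v - v_th, 1
    return v, 0


# Usage: drive the neuron with a constant input of 0.3 per step.
v, spikes = 0.0, 0
for _ in range(10):
    v, s = bif_step(v, 0.3)
    spikes += s
```

In conversion settings, retaining sub-threshold state in this way is what allows accurate rate coding with fewer time steps, which is the latency reduction the abstract reports.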
ISSN:1932-6203