Enhancing temporal learning in recurrent spiking networks for neuromorphic applications
Training Recurrent Spiking Neural Networks (RSNNs) with binary spikes on tasks that span long time scales is challenging because of the amplified vanishing-gradient problem during backpropagation through time. This paper introduces three crucial elements that significantly enhance the memory and capabilities of RSNNs, with a strong emphasis on compatibility with hardware and neuromorphic systems. Firstly, we incorporate neuron-level synaptic delays, which not only allow the gradient to skip time steps but also reduce the overall firing rate of the neuron population. Secondly, we apply a biologically inspired branching-factor regularization rule that stabilizes the network’s dynamics and eases training by incorporating a time-local error in the loss function. Lastly, we modify a commonly used surrogate gradient function by increasing its support, which facilitates learning over longer timescales when using binary spikes. By integrating these three elements, we not only solve several challenging benchmarks but also achieve state-of-the-art results on the spiking permuted sequential MNIST task (psMNIST), showcasing the practicality and relevance of our approach for digital and analog neuromorphic systems.
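The first ingredient, neuron-level synaptic delays, can be read as each neuron holding back its outgoing spikes by a fixed, per-neuron number of time steps, so that downstream neurons (and the backward pass) only see those spikes several steps later, letting the gradient skip over intermediate steps. The paper's own implementation is not reproduced in this record; the snippet below is a minimal PyTorch-style sketch of one way to realize such delays with a circular spike buffer, and every name in it (`DelayedLIFLayer`, the leak `beta`, the delay range) is an illustrative assumption.

```python
import torch

class DelayedLIFLayer(torch.nn.Module):
    """Recurrent leaky integrate-and-fire layer whose outgoing spikes are held
    back by a per-neuron delay (in time steps). Hypothetical sketch, not the
    authors' implementation; forward pass only (training would replace the hard
    threshold with a surrogate gradient, see the later sketch)."""

    def __init__(self, n_in, n_out, max_delay=8, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = torch.nn.Linear(n_in, n_out)
        self.rec = torch.nn.Linear(n_out, n_out, bias=False)
        self.beta, self.threshold, self.max_delay = beta, threshold, max_delay
        # One fixed integer delay per neuron, in [1, max_delay] time steps.
        self.register_buffer("delay", torch.randint(1, max_delay + 1, (n_out,)))

    def forward(self, x):                      # x: (time, batch, n_in) binary spikes
        T, B, _ = x.shape
        n_out = self.fc.out_features
        cols = torch.arange(n_out, device=x.device)
        v = torch.zeros(B, n_out, device=x.device)                      # membrane potentials
        hist = torch.zeros(self.max_delay, B, n_out, device=x.device)   # circular spike history
        outputs = []
        for t in range(T):
            # Each neuron's visible output is the spike it emitted `delay` steps ago.
            delayed = hist[(t - self.delay) % self.max_delay, :, cols].t()  # (B, n_out)
            v = self.beta * v + self.fc(x[t]) + self.rec(delayed)
            spikes = (v >= self.threshold).float()                      # binary spikes
            v = v - spikes * self.threshold                             # soft reset
            hist[t % self.max_delay] = spikes                           # overwrite oldest slot
            outputs.append(delayed)
        return torch.stack(outputs)            # (time, batch, n_out) delayed spike train

# Example: 100 time steps of sparse binary input for a batch of 16 samples.
layer = DelayedLIFLayer(n_in=40, n_out=128)
out = layer((torch.rand(100, 16, 40) < 0.1).float())
```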
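The second ingredient, branching-factor regularization, is described as a biologically inspired rule that adds a time-local error term to the loss to stabilize the network's dynamics. The exact formulation is not included in this record; one common reading, borrowed from the neural-avalanche literature, is to push the ratio of population spike counts in consecutive time steps towards 1 (critical dynamics). The helper below sketches that reading; the function name, target value, and weighting are assumptions.

```python
import torch

def branching_factor_penalty(spikes, target=1.0, eps=1e-3):
    """Time-local regularizer on the empirical branching factor, i.e. the ratio
    of population spike counts in consecutive time steps. `spikes` is a
    (time, batch, neurons) tensor of (surrogate-differentiable) binary spikes.
    Hypothetical reading of the paper's rule, not its exact formulation."""
    counts = spikes.sum(dim=2)                   # (time, batch) spikes per step
    ratio = counts[1:] / (counts[:-1] + eps)     # empirical branching factor per step
    return ((ratio - target) ** 2).mean()        # squared time-local error

# Added to the task loss with an assumed weighting, e.g.:
# loss = task_loss + 1e-2 * branching_factor_penalty(recorded_spikes)
```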
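The third ingredient widens the support of a commonly used surrogate gradient so that more neurons receive a non-zero gradient when spikes are binary. The paper's exact surrogate is not reproduced here; the sketch below uses a generic triangular surrogate whose half-width `width` is simply set larger than the usual value of about 1, purely as an illustration.

```python
import torch

class WideSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, triangular surrogate derivative with
    an enlarged support in the backward pass. The width value is illustrative,
    not the paper's choice."""

    width = 4.0   # half-width of the surrogate's support, in units of (v - threshold)

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh >= 0).float()     # binary spike

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Non-zero on |v| < width, peaking at v = 0; the wider support lets gradients
        # reach neurons whose membrane potential sits far from the threshold.
        surrogate = torch.clamp(1.0 - v.abs() / WideSpike.width, min=0.0) / WideSpike.width
        return grad_output * surrogate

spike_fn = WideSpike.apply    # usage inside a layer: spikes = spike_fn(v - threshold)
```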
| Main Authors: | Ismael Balafrej, Soufiyan Bahadi, Jean Rouat, Fabien Alibart |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IOP Publishing, 2025-01-01 |
| Series: | Neuromorphic Computing and Engineering |
| Subjects: | spiking neural network; synaptic delays; surrogate gradient descent; neural dynamics |
| Online Access: | https://doi.org/10.1088/2634-4386/add293 |

| collection | DOAJ |
|---|---|
| institution | Kabale University |
| issn | 2634-4386 |
| citation | Neuromorphic Computing and Engineering, vol. 5, no. 2, 024008 (2025). https://doi.org/10.1088/2634-4386/add293 |
| authors | Ismael Balafrej (ORCID: 0000-0001-6730-0794), Soufiyan Bahadi (ORCID: 0000-0003-0216-0457), Jean Rouat (ORCID: 0000-0002-9306-426X), Fabien Alibart (ORCID: 0000-0002-9591-220X) |
| affiliations | Ismael Balafrej, Soufiyan Bahadi, Jean Rouat: NECOTIS Research Lab, Université de Sherbrooke, Sherbrooke, J1K 2R1, Canada; Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, 3000 Boulevard de l’université, Sherbrooke, J1K 0A5, Québec, Canada. Fabien Alibart: Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, J1K 0A5, Québec, Canada; Laboratoire Nanotechnologies Nanosystèmes (LN2)-IRL3463, CNRS, Université de Sherbrooke, INSA Lyon, École Centrale de Lyon, Université Grenoble Alpes, Sherbrooke, J1K 0A5, Québec, Canada |