Theoretical understanding of gradients of spike functions as boolean functions


Bibliographic Details
Main Authors: DongHyung Yoo, Doo Seok Jeong
Format: Article
Language: English
Published: Springer 2024-11-01
Series: Complex & Intelligent Systems
Subjects: Spike function; Gradients of spike functions; Spiking neural networks; Boolean differentiation
Online Access: https://doi.org/10.1007/s40747-024-01607-9
author DongHyung Yoo
Doo Seok Jeong
collection DOAJ
description Abstract Applying an error-backpropagation algorithm to spiking neural networks typically requires fictive derivatives of spike functions (popularly referred to as surrogate gradients) because the spike function is considered non-differentiable. The non-differentiability arises when the spike function is viewed as a numeric function, most popularly the Heaviside step function of membrane potential. Getting back to basics, the spike function is not a numeric but a Boolean function that outputs True or False upon comparing the current potential with the threshold. In this regard, we propose a method to evaluate the gradient of the spike function viewed as a Boolean function for fixed- and floating-point data formats. For both formats, the gradient closely resembles a delta function peaking at the spiking threshold, which justifies approximating the spike function by the Heaviside step function. Unfortunately, the error-backpropagation algorithm with this gradient function fails to outperform popularly employed surrogate gradients, which may arise from the narrow peak of the gradient function and the consequent potential undershoot and overshoot around the spiking threshold at coarse timesteps. We provide theoretical grounds for this hypothesis.
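To connect the abstract's terms to practice: the spike function is the Heaviside step S(v) = Θ(v − ϑ) of membrane potential v and threshold ϑ, whose true derivative is the delta-like peak δ(v − ϑ) the paper analyzes, and surrogate-gradient training replaces that peak with a smoother fictive derivative in the backward pass. Below is a minimal, hypothetical PyTorch sketch of that popular workaround, not the paper's Boolean-differentiation method; the `SpikeFn` name, the triangular surrogate shape, and the `width` constant are illustrative assumptions.

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike function with a surrogate gradient (illustrative sketch)."""

    @staticmethod
    def forward(ctx, v, threshold):
        # Forward pass: the Boolean comparison v >= threshold, cast to {0, 1}.
        # Viewed numerically this is the Heaviside step, whose true derivative
        # is a delta-like peak at the threshold and is unusable for training.
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        width = 0.5  # assumed half-width of the surrogate peak (a free choice)
        # Backward pass: a fictive derivative -- here a unit-area triangular
        # bump centered at the threshold -- stands in for the true delta peak.
        surrogate = torch.clamp(1.0 - (v - ctx.threshold).abs() / width, min=0.0) / width
        return grad_output * surrogate, None  # no gradient w.r.t. the threshold

# Usage: spikes = SpikeFn.apply(membrane_potential, 1.0)
```

As `width` shrinks toward zero the triangular bump approaches the delta-like true gradient; per the abstract's hypothesis, so narrow a peak rarely overlaps the membrane potential when coarse timesteps make it undershoot or overshoot the threshold, which is why wider surrogates tend to train better.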
format Article
id doaj-art-86d30ee00ace4cb7886edc7008fb2a6f
institution Kabale University
issn 2199-4536
2198-6053
language English
publishDate 2024-11-01
publisher Springer
record_format Article
series Complex & Intelligent Systems
spelling doaj-art-86d30ee00ace4cb7886edc7008fb2a6f | 2025-02-02T12:48:55Z | eng | Springer | Complex & Intelligent Systems | ISSN 2199-4536, 2198-6053 | 2024-11-01 | Vol. 11, Iss. 1, pp. 1-17 | 10.1007/s40747-024-01607-9 | Theoretical understanding of gradients of spike functions as boolean functions | DongHyung Yoo, Doo Seok Jeong (both: Division of Materials Science and Engineering, Hanyang University) | https://doi.org/10.1007/s40747-024-01607-9 | Spike function; Gradients of spike functions; Spiking neural networks; Boolean differentiation
title Theoretical understanding of gradients of spike functions as boolean functions
topic Spike function
Gradients of spike functions
Spiking neural networks
Boolean differentiation
url https://doi.org/10.1007/s40747-024-01607-9