Blessing of dimensionality in spiking neural networks: the by-chance functional learning

Bibliographic Details
Main Authors: Valeri A. Makarov, Sergey A. Lobov
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-06-01
Series: Frontiers in Applied Mathematics and Statistics
Online Access: https://www.frontiersin.org/articles/10.3389/fams.2025.1553779/full
Description
Summary: Spiking neural networks (SNNs) have significant potential for power-efficient neuromorphic AI. However, their training is challenging because most of the learning principles known from artificial neural networks are hardly applicable. Recently, the concept of the “blessing of dimensionality” has been used successfully to treat high-dimensional data and representations of reality. It exploits the fundamental trade-off between the complexity and simplicity of statistical sets in high-dimensional spaces without relying on global optimization techniques. We show that the frequency encoding of memories in SNNs can leverage this paradigm: it enables detecting and learning arbitrary information items, provided the networks operate in high dimensions. To illustrate the hypothesis, we develop a minimalist model of information processing in layered brain structures and study the emergence of extreme selectivity to multiple stimuli and associative memories. Our results suggest that global optimization of cost functions may be circumvented at different levels of information processing in SNNs and replaced by by-chance learning, greatly simplifying the design of AI devices.
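The "blessing of dimensionality" argument summarized above rests on stochastic separation: in a high-dimensional space, a randomly chosen item is, with overwhelming probability, linearly separable from a large random background set by a trivial hyperplane — no global optimization needed. The following NumPy sketch is not from the article; the Gaussian data model, the dimension, and the threshold choice are illustrative assumptions used to demonstrate the effect:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 2000, 1000  # assumed dimension and background-set size (illustrative)

# Random "memory" items with roughly unit norm (Gaussian model, an assumption).
background = rng.standard_normal((n, d)) / np.sqrt(d)
target = rng.standard_normal(d) / np.sqrt(d)

# One-shot "by-chance" separation: use the target itself as the hyperplane
# normal. In high dimensions, random vectors are nearly orthogonal, so
# background projections concentrate near 0 while the target scores ~1.
scores_bg = background @ target          # inner products with background items
score_t = target @ target                # inner product with itself (~1)
theta = 0.5 * score_t                    # threshold halfway to the target score

separated = bool(np.all(scores_bg < theta) and score_t >= theta)
print(separated)  # → True
```

For d = 2000 the background projections have standard deviation about 1/√d ≈ 0.02, so a threshold near 0.5 misclassifies essentially nothing even across a thousand items — the single-hyperplane "learning" step succeeds by chance alone.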
ISSN: 2297-4687