Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks

Artificial neural networks are limited in the number of patterns they can store and accurately recall, with capacity constraints arising from factors such as network size, architectural structure, pattern sparsity, and pattern dissimilarity. Exceeding these limits causes recall errors and, eventually, catastrophic forgetting, a major challenge in continual learning. In this study, we characterize the theoretical maximum memory capacity of single-layer feedforward networks as a function of these parameters. We derive analytical expressions for maximum theoretical memory capacity and introduce a grid-based construction and sub-sampling method for pattern generation that exploits the network's full storage potential. Our findings indicate that maximum capacity scales as (N/S)^S, where N is the number of input/output units and S the pattern sparsity, under threshold constraints related to minimum pattern differentiability. Simulation results validate these theoretical predictions and show that the optimal pattern set can be constructed deterministically for any given network size and pattern sparsity, systematically outperforming random pattern generation in storage capacity. This work offers a foundational framework for maximizing storage efficiency in neural network systems and supports the development of data-efficient, sustainable AI.
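
To make the capacity claim concrete, the following minimal Python sketch enumerates a pattern set under one plausible reading of the abstract's grid-based construction: partition the N units into S equal blocks and activate exactly one unit per block, which yields exactly (N/S)^S distinct sparse patterns. This block-partition reading, and the helper names theoretical_capacity and grid_patterns, are illustrative assumptions; the record does not spell out the paper's exact construction.

    import itertools
    import numpy as np

    def theoretical_capacity(N: int, S: int) -> int:
        # Capacity scaling from the abstract: (N/S)^S storable patterns.
        assert N % S == 0, "sketch assumes N is divisible by S"
        return (N // S) ** S

    def grid_patterns(N: int, S: int) -> np.ndarray:
        # Hypothetical grid-based construction: split the N units into S
        # equal blocks and activate exactly one unit per block, enumerating
        # all (N/S)^S sparse binary patterns deterministically.
        block = N // S
        blocks = [range(b * block, (b + 1) * block) for b in range(S)]
        patterns = np.zeros((block ** S, N), dtype=np.uint8)
        for i, active in enumerate(itertools.product(*blocks)):
            patterns[i, list(active)] = 1
        return patterns

    # Example: N = 8 units at sparsity S = 2 gives (8/2)^2 = 16 patterns.
    P = grid_patterns(8, 2)
    assert P.shape == (theoretical_capacity(8, 2), 8)
    assert (P.sum(axis=1) == 2).all()  # every pattern activates S units

Under this reading, any two generated patterns differ in at least one block, which loosely corresponds to the abstract's minimum pattern differentiability constraint, and the total count reproduces the (N/S)^S scaling.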

Bibliographic Details
Main Authors: Zane Z. Chou, Jean-Marie C. Bouteiller
Format: Article
Language: English
Published: Frontiers Media S.A., 2025-08-01
Series: Frontiers in Computational Neuroscience
Subjects: neural network; memory capacity; data-efficient AI; sustainable AI; constructive algorithms
Online Access: https://www.frontiersin.org/articles/10.3389/fncom.2025.1646810/full
_version_ 1849225276933799936
author Zane Z. Chou
Jean-Marie C. Bouteiller
author_facet Zane Z. Chou
Jean-Marie C. Bouteiller
author_sort Zane Z. Chou
collection DOAJ
description Artificial neural networks are limited in the number of patterns they can store and accurately recall, with capacity constraints arising from factors such as network size, architectural structure, pattern sparsity, and pattern dissimilarity. Exceeding these limits causes recall errors and, eventually, catastrophic forgetting, a major challenge in continual learning. In this study, we characterize the theoretical maximum memory capacity of single-layer feedforward networks as a function of these parameters. We derive analytical expressions for maximum theoretical memory capacity and introduce a grid-based construction and sub-sampling method for pattern generation that exploits the network's full storage potential. Our findings indicate that maximum capacity scales as (N/S)^S, where N is the number of input/output units and S the pattern sparsity, under threshold constraints related to minimum pattern differentiability. Simulation results validate these theoretical predictions and show that the optimal pattern set can be constructed deterministically for any given network size and pattern sparsity, systematically outperforming random pattern generation in storage capacity. This work offers a foundational framework for maximizing storage efficiency in neural network systems and supports the development of data-efficient, sustainable AI.
format Article
id doaj-art-b078be9c2f1a43549661368605f557bf
institution Kabale University
issn 1662-5188
language English
publishDate 2025-08-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Computational Neuroscience
spelling doaj-art-b078be9c2f1a43549661368605f557bf (2025-08-25T05:25:26Z)
Language: eng
Publisher: Frontiers Media S.A.
Series: Frontiers in Computational Neuroscience (ISSN 1662-5188)
Published: 2025-08-01, vol. 19
DOI: 10.3389/fncom.2025.1646810
Article number: 1646810
Title: Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
Authors and affiliations:
Zane Z. Chou: Department of Biomedical Engineering, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, United States
Jean-Marie C. Bouteiller: Department of Biomedical Engineering, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, United States; Institute for Technology and Medical Systems (ITEMS), Keck School of Medicine, University of Southern California, Los Angeles, CA, United States; Center for Artificial Intelligence and Quantum Computing in System Brain Research (CLARA), Prague, Czechia; International Neurodegenerative Disorders Research Center (INDRC), Prague, Czechia
Keywords: neural network; memory capacity; data-efficient AI; sustainable AI; constructive algorithms
URL: https://www.frontiersin.org/articles/10.3389/fncom.2025.1646810/full
spellingShingle Zane Z. Chou
Jean-Marie C. Bouteiller
Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
Frontiers in Computational Neuroscience
neural network
memory capacity
data-efficient AI
sustainable AI
constructive algorithms
title Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
title_full Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
title_fullStr Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
title_full_unstemmed Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
title_short Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
title_sort maximizing theoretical and practical storage capacity in single layer feedforward neural networks
topic neural network
memory capacity
data-efficient AI
sustainable AI
constructive algorithms
url https://www.frontiersin.org/articles/10.3389/fncom.2025.1646810/full
work_keys_str_mv AT zanezchou maximizingtheoreticalandpracticalstoragecapacityinsinglelayerfeedforwardneuralnetworks
AT jeanmariecbouteiller maximizingtheoreticalandpracticalstoragecapacityinsinglelayerfeedforwardneuralnetworks