Hybrid neural networks for continual learning inspired by corticohippocampal circuits

Bibliographic Details
Main Authors: Qianqian Shi, Faqiang Liu, Hongyi Li, Guangyu Li, Luping Shi, Rong Zhao
Format: Article
Language: English
Published: Nature Portfolio 2025-02-01
Series: Nature Communications
Online Access: https://doi.org/10.1038/s41467-025-56405-9
Description
Summary: Current artificial systems suffer from catastrophic forgetting during continual learning, a limitation absent in biological systems. Biological mechanisms leverage the dual representation of specific and generalized memories within corticohippocampal circuits to facilitate lifelong learning. Inspired by this, we develop a corticohippocampal circuits-based hybrid neural network (CH-HNN) that emulates these dual representations, significantly mitigating catastrophic forgetting in both task-incremental and class-incremental learning scenarios. Our CH-HNNs incorporate artificial neural networks (ANNs) and spiking neural networks (SNNs), leveraging prior knowledge to facilitate new concept learning through episode inference, and offering insights into the neural functions of both feedforward and feedback loops within corticohippocampal circuits. Crucially, CH-HNN operates as a task-agnostic system without increasing memory demands, demonstrating adaptability and robustness in real-world applications. Coupled with the low power consumption inherent to SNNs, our model demonstrates the potential for energy-efficient continual learning in dynamic environments.
ISSN: 2041-1723
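
To make the architecture described in the summary more concrete, the sketch below shows one possible way to combine an ANN-based episode-inference pathway with a simplified spiking pathway in PyTorch. This is a minimal illustrative sketch, not the authors' released implementation: the module names (EpisodeInferenceANN, ModulatedSNN, CHHNNSketch), layer sizes, the sigmoid gating of spiking units by the inferred episode code, and the surrogate-gradient threshold are all assumptions made for demonstration only.

```python
# Hypothetical sketch (not the paper's code): an ANN pathway infers a soft
# "episode" code from the input, and that code gates a simplified spiking
# pathway, so no task label is needed at inference time (task-agnostic).
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient, a common SNN training trick."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the firing threshold.
        return grad_out * (v.abs() < 0.5).float()


class EpisodeInferenceANN(nn.Module):
    """ANN pathway: infers a soft episode/context code from the input (assumed role)."""

    def __init__(self, in_dim, n_episodes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, n_episodes)
        )

    def forward(self, x):
        return torch.softmax(self.net(x), dim=-1)


class ModulatedSNN(nn.Module):
    """Spiking pathway whose hidden activity is gated by the inferred episode code."""

    def __init__(self, in_dim, hidden, n_classes, n_episodes):
        super().__init__()
        self.fc_in = nn.Linear(in_dim, hidden)
        self.gate = nn.Linear(n_episodes, hidden)  # episode-conditioned modulation
        self.fc_out = nn.Linear(hidden, n_classes)

    def forward(self, x, episode):
        v = self.fc_in(x) * torch.sigmoid(self.gate(episode))
        spikes = SurrogateSpike.apply(v - 0.5)  # arbitrary threshold of 0.5
        return self.fc_out(spikes)


class CHHNNSketch(nn.Module):
    """Task-agnostic forward pass: episode inference first, then the modulated spiking pathway."""

    def __init__(self, in_dim=784, hidden=256, n_classes=10, n_episodes=5):
        super().__init__()
        self.ann = EpisodeInferenceANN(in_dim, n_episodes)
        self.snn = ModulatedSNN(in_dim, hidden, n_classes, n_episodes)

    def forward(self, x):
        episode = self.ann(x)  # no task identity supplied at inference time
        return self.snn(x, episode)


if __name__ == "__main__":
    model = CHHNNSketch()
    logits = model(torch.randn(8, 784))
    print(logits.shape)  # torch.Size([8, 10])
```

In this toy setup the episode code plays the role of the generalized (cortical) memory that modulates the specific (spiking) pathway; how the episode code is learned, consolidated, and reused across tasks is exactly what the paper's feedforward and feedback corticohippocampal loops address, and is not captured by this sketch.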