Modeling of whole brain sleep electroencephalogram using deep oscillatory neural network


Bibliographic Details
Main Authors: Sayan Ghosh, Dipayan Biswas, N. R. Rohan, Sujith Vijayan, V. Srinivasa Chakravarthy
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-05-01
Series: Frontiers in Neuroinformatics
Subjects:
Online Access:https://www.frontiersin.org/articles/10.3389/fninf.2025.1513374/full
Description
Summary: This study presents a general trainable network of Hopf oscillators to model high-dimensional electroencephalogram (EEG) signals across different sleep stages. The proposed architecture consists of two main components: a layer of interconnected oscillators and a complex-valued feed-forward network, designed both with and without a hidden layer. Incorporating a hidden layer in the feed-forward network yields lower reconstruction errors than the simpler version without it. Our model reconstructs EEG signals across all five sleep stages and predicts the subsequent 5 s of EEG activity. The predicted data closely align with the empirical EEG in terms of mean absolute error, power spectral similarity, and complexity measures. We propose three models of increasing complexity, from an initial trained architecture to variants with and without hidden layers. In these models, the oscillators initially lack spatial localization; in the final two models, we introduce spatial constraints by superimposing spherical-shell and rectangular geometries onto the oscillator network. Overall, the proposed model represents a step toward constructing a large-scale, biologically inspired model of brain dynamics.
ISSN:1662-5196
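
The abstract describes a layer of Hopf oscillators feeding a complex-valued readout. The following is a minimal sketch of that idea, not the paper's trained model: it integrates independent Hopf oscillators (dz/dt = (mu + i*omega - |z|^2) z, limit-cycle radius sqrt(mu)) in polar form and mixes their complex states through a random, untrained complex weight matrix standing in for the paper's feed-forward network. All parameter values and the `hopf_bank` helper are illustrative assumptions.

```python
import numpy as np

def hopf_bank(mu, omega, dt=1e-3, steps=5000, r0=0.1):
    """Integrate independent Hopf oscillators in polar form.

    dr/dt = (mu - r^2) r,  dphi/dt = omega.
    The polar form keeps forward-Euler amplitude-stable at EEG-band
    frequencies; each oscillator settles on a limit cycle of radius
    sqrt(mu) and rotates at angular frequency omega.
    """
    mu = np.atleast_1d(np.asarray(mu, dtype=float))
    omega = np.atleast_1d(np.asarray(omega, dtype=float))
    r = np.full(mu.shape, r0)
    phi = np.zeros(mu.shape)
    out = np.empty((steps, mu.size), dtype=complex)
    for k in range(steps):
        r = r + dt * (mu - r**2) * r      # radial relaxation to sqrt(mu)
        phi = phi + dt * omega            # uniform phase advance
        out[k] = r * np.exp(1j * phi)     # complex oscillator state
    return out

# Two illustrative oscillators: 2 Hz (delta/theta range) and 10 Hz (alpha).
Z = hopf_bank(mu=[1.0, 1.0], omega=[2 * np.pi * 2, 2 * np.pi * 10])

# Random complex linear readout -- an untrained stand-in for the paper's
# complex-valued feed-forward network -- mapping oscillator states to one
# real "EEG-like" channel.
rng = np.random.default_rng(0)
W = rng.normal(size=(1, 2)) + 1j * rng.normal(size=(1, 2))
eeg_like = (Z @ W.T).real  # shape (steps, 1)
```

In the paper, the readout weights (and, in the richer variants, a hidden layer) would be trained so that the mixed oscillator signals reconstruct recorded multichannel EEG; here the weights are random, so `eeg_like` merely illustrates the signal path.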