An Improved Stochastic Configuration Networks With Compact Structure and Parameter Adaptation

Bibliographic Details
Main Authors: Sanyi Li, Hongyu Guan, Peng Liu, Weichao Yue, Qian Wang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10852165/
Description
Summary: Stochastic Configuration Networks (SCNs) perform well in machine learning and data mining tasks in complex data environments, but traditional SCNs are limited by network size and computation time. To address these issues, this paper proposes an improved version of SCNs with two key improvements. First, the stopping condition for generating neurons is optimized to improve the effectiveness of new neurons. Second, the regularization parameter r is adjusted dynamically to speed up the learning process. These improvements aim to increase the efficiency of SCN construction, reduce the number of redundant neurons, and shorten the overall computation time. Experiments comparing this method with existing ones show that the proposed approach not only reduces network complexity but also effectively decreases training time. In addition, experimental results on benchmark datasets and datasets from the UCI repository show that the number of nodes required by the proposed OSCN is reduced by approximately 50% and the computation time by approximately 40% compared to traditional algorithms.
ISSN: 2169-3536
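For readers unfamiliar with SCNs, the incremental construction the abstract refers to can be sketched as follows. This is a minimal, illustrative reconstruction of the classic SCN loop (random candidate nodes accepted only if they pass a supervisory inequality governed by the parameter r, output weights refit by least squares); the function name, the tanh activation, and all ranges are assumptions, and the paper's optimized stopping condition and dynamic r schedule are not reproduced here.

```python
import numpy as np

def scn_fit(X, y, max_nodes=30, tol=1e-3, lambdas=(1.0, 5.0, 10.0),
            r_values=(0.9, 0.99, 0.999), candidates=30, seed=0):
    """Incrementally add random hidden nodes under an SCN-style supervisory check."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = np.empty((n, 0))                  # hidden-layer output matrix
    beta = np.zeros(0)                    # output weights
    e = y.copy()                          # current residual
    for _ in range(max_nodes):
        if np.linalg.norm(e) < tol:
            break                         # residual small enough: stop
        best = None
        for lam in lambdas:               # widen the random-weight range if needed
            for r in r_values:            # relax r until a node is admissible
                for _ in range(candidates):
                    w = rng.uniform(-lam, lam, size=d)
                    b = rng.uniform(-lam, lam)
                    g = np.tanh(X @ w + b)
                    # supervisory inequality: accept nodes whose projection
                    # onto the residual exceeds the r-controlled threshold
                    xi = (e @ g) ** 2 / (g @ g) - (1.0 - r) * (e @ e)
                    if xi > 0 and (best is None or xi > best[0]):
                        best = (xi, g)
                if best is not None:
                    break
            if best is not None:
                break
        if best is None:
            break                         # no admissible candidate found
        H = np.column_stack([H, best[1]])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refit output weights
        e = y - H @ beta
    return H, beta
```

The number of nodes admitted by this loop, and the time spent searching for admissible candidates, are exactly the quantities the paper's stopping-condition and dynamic-r improvements target.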