Laor Initialization: A New Weight Initialization Method for the Backpropagation of Deep Learning

This paper presents Laor Initialization, an innovative weight initialization technique for deep neural networks that utilizes forward-pass error feedback in conjunction with k-means clustering to optimize the initial weights. In contrast to traditional methods, Laor adopts a data-driven approach tha...
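The record above only summarizes the method and the abstract is cut off, so the paper's exact procedure is not available here. As a purely illustrative sketch of the two ingredients the abstract names, the snippet below seeds a first-layer weight matrix from k-means centroids of the input data and uses a forward-pass error to select among candidate initializations; all function names and the scoring scheme (a least-squares readout) are assumptions, not the authors' algorithm.

```python
import numpy as np


def simple_kmeans(X, k, iters=20, rng=None):
    """Plain Lloyd's k-means on the rows of X; returns (k, n_features) centroids."""
    rng = rng or np.random.default_rng(0)
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):  # keep old centroid if the cluster emptied
                centers[j] = pts.mean(0)
    return centers


def init_first_layer(X, y, hidden, candidates=5):
    """Hypothetical data-driven initialization: try several k-means seedings
    and keep the weight matrix whose forward pass gives the lowest error."""
    best_W, best_err = None, np.inf
    for seed in range(candidates):
        rng = np.random.default_rng(seed)
        W = simple_kmeans(X, hidden, rng=rng)   # (hidden, n_features)
        h = np.tanh(X @ W.T)                    # forward pass (bias omitted for brevity)
        # score the candidate with a cheap least-squares readout
        w_out, *_ = np.linalg.lstsq(h, y, rcond=None)
        err = np.mean((h @ w_out - y) ** 2)
        if err < best_err:
            best_W, best_err = W, err
    return best_W, best_err
```

The intuition behind centroid seeding is that each hidden unit starts out responding to a genuine mode of the input distribution rather than to random directions, and the forward-pass score filters out unlucky clusterings before training begins.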


Bibliographic Details
Main Authors: Laor Boongasame, Jirapond Muangprathub, Karanrat Thammarak
Format: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Big Data and Cognitive Computing
Online Access: https://www.mdpi.com/2504-2289/9/7/181