Improving neural network training using dynamic learning rate schedule for PINNs and image classification

Training neural networks can be challenging, especially as the complexity of the problem increases. Even with wider or deeper networks, training can be a tedious process, particularly when a hyperparameter is chosen poorly. The learning rate is one such crucial hyperparameter, and it is usually kept static during training. Learning dynamics in complex systems, however, often call for a more adaptive learning rate: this adaptability becomes crucial for navigating varying gradients and optimizing the training process. In this paper, a dynamic learning rate scheduler (DLRS) algorithm is presented that adapts the learning rate based on the loss values computed during training. Experiments are conducted on problems related to physics-informed neural networks (PINNs) and on image classification, using multilayer perceptrons and convolutional neural networks, respectively. The results demonstrate that the proposed DLRS accelerates training and improves stability.

Bibliographic Details
Main Authors: Veerababu Dharanalakota, Ashwin Arvind Raikar, Prasanta Kumar Ghosh
Format: Article
Language: English
Published: Elsevier 2025-09-01
Series: Machine Learning with Applications
Subjects: Adaptive learning; Multilayer perceptron; CNN; MNIST; CIFAR-10
Online Access: http://www.sciencedirect.com/science/article/pii/S2666827025000805
author Veerababu Dharanalakota
Ashwin Arvind Raikar
Prasanta Kumar Ghosh
collection DOAJ
description Training neural networks can be challenging, especially as the complexity of the problem increases. Even with wider or deeper networks, training can be a tedious process, particularly when a hyperparameter is chosen poorly. The learning rate is one such crucial hyperparameter, and it is usually kept static during training. Learning dynamics in complex systems, however, often call for a more adaptive learning rate: this adaptability becomes crucial for navigating varying gradients and optimizing the training process. In this paper, a dynamic learning rate scheduler (DLRS) algorithm is presented that adapts the learning rate based on the loss values computed during training. Experiments are conducted on problems related to physics-informed neural networks (PINNs) and on image classification, using multilayer perceptrons and convolutional neural networks, respectively. The results demonstrate that the proposed DLRS accelerates training and improves stability.
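The abstract describes the DLRS only at a high level: the learning rate is adapted from loss values observed during training. As a loose illustration of that idea only (this is not the paper's actual algorithm; the class name and the multiplicative up/down factors below are invented for this sketch), a minimal loss-driven scheduler might look like:

```python
# Hypothetical sketch of a loss-driven learning-rate scheduler.
# The DLRS update rule from the paper is NOT reproduced here; the
# increase/decrease factors are illustrative assumptions only.

class LossBasedLRScheduler:
    def __init__(self, lr=1e-3, up=1.05, down=0.7, min_lr=1e-6, max_lr=1.0):
        self.lr = lr
        self.up = up          # multiplicative boost when the loss improves
        self.down = down      # multiplicative cut when the loss worsens
        self.min_lr = min_lr  # clamp to keep the step size sane
        self.max_lr = max_lr
        self.prev_loss = None

    def step(self, loss):
        """Update and return the learning rate from the latest loss."""
        if self.prev_loss is not None:
            if loss < self.prev_loss:
                # Loss decreased: cautiously speed up.
                self.lr = min(self.lr * self.up, self.max_lr)
            else:
                # Loss stagnated or increased: slow down.
                self.lr = max(self.lr * self.down, self.min_lr)
        self.prev_loss = loss
        return self.lr
```

In use, one would call `sched.step(loss)` once per epoch (or batch) and copy the returned value into the optimizer's learning rate; built-in loss-reactive schedulers such as `ReduceLROnPlateau` in PyTorch and Keras follow a related decrease-only pattern.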
id doaj-art-be1e25e70dbd4eb98ec2991c91ad6784
issn 2666-8270
doi 10.1016/j.mlwa.2025.100697
affiliations Veerababu Dharanalakota: Department of Electrical Engineering, Indian Institute of Science, CV Raman Road, Bengaluru, 560012, Karnataka, India
Ashwin Arvind Raikar: Department of Computer Science, Purdue University, Fort Wayne, IN 46805, USA
Prasanta Kumar Ghosh (corresponding author): Department of Electrical Engineering, Indian Institute of Science, CV Raman Road, Bengaluru, 560012, Karnataka, India
topic Adaptive learning
Multilayer perceptron
CNN
MNIST
CIFAR-10