A Robust Hybrid CNN–LSTM Model for Predicting Student Academic Performance


Bibliographic Details
Main Authors: Kuburat Oyeranti Adefemi, Murimo Bethel Mutanga
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Digital
Subjects:
Online Access: https://www.mdpi.com/2673-6470/5/2/16
Description
Summary: The rapid increase in educational data from diverse sources such as learning management systems and assessment records necessitates the application of advanced analytical techniques to identify at-risk students and address persistent issues like dropout rates and academic underperformance. However, many existing models struggle with generalizability and fail to effectively manage data challenges such as class imbalance and missing data, leading to suboptimal predictive performance. This study proposes a hybrid deep learning model combining convolutional neural networks (CNNs) and long short-term memory (LSTM) networks to improve the accuracy of student academic performance prediction and enable timely educational interventions. To improve model performance and reliability, we incorporate feature selection techniques and optimization strategies, and we address common preprocessing challenges such as missing data and class imbalance. The proposed model was evaluated on two benchmark datasets to assess its generalization capability. The hybrid model achieved predictive accuracies of 98.93% and 98.82% on the two datasets, respectively, outperforming traditional machine learning models and standalone deep learning approaches across key performance metrics including accuracy, precision, recall, and F-score.
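The summary highlights preprocessing for missing data and class imbalance. A minimal sketch of two common choices for these steps — per-feature mean imputation and random oversampling of minority classes — is shown below. These specific techniques are assumptions for illustration; the article's abstract does not state which imputation or balancing methods the authors actually used.

```python
import numpy as np

def preprocess(X, y, seed=0):
    """Illustrative preprocessing for tabular student data:
    1) replace missing values (NaN) with each feature's column mean;
    2) randomly oversample minority classes until all classes
       match the majority class count.
    These are assumed example techniques, not the paper's exact pipeline."""
    rng = np.random.default_rng(seed)
    X = X.astype(float).copy()

    # Mean imputation: fill each NaN with its column's mean over observed values
    col_means = np.nanmean(X, axis=0)
    nan_rows, nan_cols = np.where(np.isnan(X))
    X[nan_rows, nan_cols] = col_means[nan_cols]

    # Random oversampling: duplicate minority-class rows (with replacement)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_parts, y_parts = [X], [y]
    for cls, count in zip(classes, counts):
        if count < target:
            idx = np.where(y == cls)[0]
            extra = rng.choice(idx, size=target - count, replace=True)
            X_parts.append(X[extra])
            y_parts.append(y[extra])
    return np.vstack(X_parts), np.concatenate(y_parts)
```

After this step, every class appears equally often in the returned labels, and no NaN values remain — the balanced, complete array can then be fed to a downstream model such as the CNN–LSTM described in the abstract.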
ISSN:2673-6470