xLSTMKT: xLSTM for Knowledge Tracing
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11108171/ |
| Summary: | Knowledge Tracing focuses on modeling a student's evolving knowledge state to accurately predict their responses to future questions across a set of knowledge concepts. In recent years, KT models based on Long Short-Term Memory (LSTM) and Transformer architectures have achieved significant performance improvements, surpassing traditional Bayesian approaches. However, existing methods fuse question-specific and knowledge-concept-specific information at an early stage, limiting the model's ability to fully capture the fine-grained details of student learning dynamics. To address these limitations, we propose a novel framework called xLSTM Knowledge Tracing (xLSTMKT) that explicitly separates student interactions into question-level and skill-level streams, generating dynamic embeddings for each at every time step. These embeddings are then jointly utilized to enhance next-response prediction. Extensive experiments conducted on several datasets demonstrate that our model consistently outperforms state-of-the-art KT methods, achieving at least a 1.42% improvement in AUC. Furthermore, we introduce a new problem formulation for the KT task, which yields additional performance gains, notably improving AUC scores on the ALGEBRA2005 dataset. |
| ISSN: | 2169-3536 |
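The summary's core idea, keeping question-level and skill-level streams separate and fusing them only at prediction time, can be sketched abstractly. This is a minimal toy illustration of late fusion, not the authors' architecture: it omits the xLSTM recurrence entirely, and all names, sizes, and the random untrained weights are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes chosen for illustration; not taken from the paper.
NUM_QUESTIONS, NUM_SKILLS, DIM = 50, 10, 8

# Two separate embedding tables, one per stream. In xLSTMKT these
# embeddings would be dynamic (updated each time step); here they are static.
question_emb = rng.normal(size=(NUM_QUESTIONS, DIM))
skill_emb = rng.normal(size=(NUM_SKILLS, DIM))
fusion_w = rng.normal(size=2 * DIM)  # untrained late-fusion weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_response(question_id, skill_id):
    """Late fusion: concatenate the question-stream and skill-stream
    embeddings, then score the joint vector with a linear head."""
    h = np.concatenate([question_emb[question_id], skill_emb[skill_id]])
    return sigmoid(fusion_w @ h)

# Probability that the student answers question 3 (tagged skill 7) correctly.
p = predict_response(3, 7)
print(0.0 < p < 1.0)
```

The contrast with early fusion is that nothing mixes the two streams before the final head; each stream could be driven by its own recurrent state, which is what the abstract attributes to xLSTMKT.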