IHML: Incremental Heuristic Meta-Learner
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2024-12-01 |
| Series: | Applied Artificial Intelligence |
| Online Access: | https://www.tandfonline.com/doi/10.1080/08839514.2024.2434309 |
| Summary: | The landscape of machine learning constantly demands innovative approaches to enhance algorithms’ performance across diverse tasks. Meta-learning, known as “learning to learn,” is a promising way to overcome these diversity challenges by blending multiple algorithms. This study introduces IHML: the Incremental Heuristic Meta-Learner, a novel meta-learning algorithm for classification tasks. By leveraging a variety of base-learners with distinct learning dynamics, such as Gaussian, tree-based, and instance-based models, IHML offers a comprehensive solution adaptable to different data characteristics. The core contributions of IHML lie in its mechanism for determining the optimal base-learner and feature sets with the help of Explainable Artificial Intelligence (XAI) and a heuristic elbow method. Existing work in this context mostly uses XAI for pre-processing the data or post-analysis of the results; IHML, by contrast, incorporates XAI into the learning process iteratively, improving the prediction performance of the meta-learner. To evaluate the proposed IHML, we used five datasets from astrophysics, physics, biology, e-commerce, and economics. The results show that the proposed model achieves higher accuracy (10% on average and up to 71% improvement) than baseline machine learning models in the literature. |
| ISSN: | 0883-9514, 1087-6545 |
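The summary mentions a heuristic elbow method for deciding how many XAI-ranked features to keep. The paper's exact heuristic is not reproduced in this record, so the following is only an illustrative sketch of one common elbow rule: on the sorted importance curve, pick the point farthest from the straight line joining the first and last scores.

```python
def elbow_cutoff(scores):
    """Return how many features to keep from a list of importance scores.

    Illustrative elbow heuristic only (the paper's exact method may differ):
    sort scores descending and keep everything up to the point with maximum
    perpendicular distance from the line joining the curve's endpoints.
    """
    s = sorted(scores, reverse=True)
    n = len(s)
    if n < 3:
        return n  # too few points to locate an elbow; keep all features
    x1, y1, x2, y2 = 0, s[0], n - 1, s[-1]
    den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5

    def dist(i):
        # perpendicular distance from point (i, s[i]) to the endpoint line
        return abs((y2 - y1) * i - (x2 - x1) * s[i] + x2 * y1 - y2 * x1) / den

    elbow = max(range(n), key=dist)
    return elbow + 1  # number of features up to and including the elbow

# Hypothetical importance scores with a sharp drop after the third feature:
print(elbow_cutoff([0.40, 0.30, 0.20, 0.04, 0.03, 0.02, 0.01]))  # → 4
```

In an iterative scheme like the one the abstract describes, such a cutoff could be recomputed each round after XAI re-ranks the features, shrinking the feature set before the base-learners are retrained.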