Influence-Balanced XGBoost: Improving XGBoost for Imbalanced Data Using Influence Functions

Bibliographic Details
Main Authors: Akiyoshi Sutou, Jinfang Wang
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10807295/
Description
Summary: Decision tree boosting algorithms, such as XGBoost, have demonstrated superior predictive performance on tabular data for supervised learning compared to neural networks. However, recent studies on loss functions for imbalanced data have focused primarily on deep learning. The goal of this study is to improve the XGBoost algorithm for better performance on imbalanced data. To this end, the influence-balanced loss (IBL), originally introduced in deep learning, was applied to the XGBoost algorithm. As a side benefit, the proposed method was also found to perform well on datasets prone to over-specialization. Furthermore, we compared the proposed method with conventional techniques on 38 publicly available datasets; our method outperforms the others in terms of F1-score and Matthews correlation coefficient.
ISSN:2169-3536
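
The summary describes reweighting samples by an influence estimate before boosting. As a rough illustration only (the paper's exact IBL formulation for XGBoost is not reproduced here), the sketch below assumes a linear-model approximation where a sample's influence is estimated as the gradient magnitude of the logistic loss, |p - y| * ||x||_1, and each sample is weighted by the inverse of that estimate, so highly influential (often majority-class, boundary-dominating) samples are down-weighted:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def ib_style_weights(logits, labels, features, eps=1e-8):
    """Influence-balanced-style sample weights (illustrative assumption,
    not the paper's exact formula): influence of sample i is estimated as
    |p_i - y_i| * ||x_i||_1, i.e. the gradient magnitude of the logistic
    loss w.r.t. a linear model's weights, and each sample is weighted by
    the inverse of its influence."""
    raw = []
    for z, y, x in zip(logits, labels, features):
        p = sigmoid(z)
        influence = abs(p - y) * sum(abs(v) for v in x) + eps
        raw.append(1.0 / influence)
    # Normalize so the weights average to 1, preserving the loss scale.
    mean_w = sum(raw) / len(raw)
    return [w / mean_w for w in raw]
```

Weights produced this way could, for example, be passed to XGBoost through the `sample_weight` argument of `XGBClassifier.fit`, or folded into the per-sample gradients and hessians of a custom objective.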