A Heat Decay Model-Based Hybrid Sampling Algorithm for Imbalanced Overlapping Datasets

Bibliographic Details
Main Authors: Liangliang Tao, Lilin Zhang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11017636/
Description
Summary: Imbalanced datasets pose significant challenges to classification tasks, as traditional classifiers often favor majority classes. Although numerous methods have been proposed to balance data distributions, recent studies indicate that imbalanced datasets frequently exhibit complex intrinsic characteristics, among which class overlap is the most harmful to classification performance. To address this, we propose a novel sampling algorithm called Heat Decay Model-Based Hybrid Sampling (HDHS). The proposed method comprises three key steps. First, it calculates the weight of each minority class sample from density and closeness factors, which addresses within-class imbalance and initializes the heat of each minority class sample. Second, a heat decay model is employed to dynamically expand the sampling region until thermal equilibrium is reached, simultaneously removing majority class samples from the extended region. Finally, a weighted oversampling strategy is applied to synthesize new minority samples within the adjusted space. Experimental evaluations on 50 imbalanced datasets demonstrate that HDHS significantly outperforms 10 state-of-the-art methods in terms of AUC, G-mean, and F-measure; this advantage is especially pronounced on datasets with a high degree of class overlap. Moreover, HDHS generalizes well across classification paradigms, consistently achieving superior performance when integrated with classifiers such as KNN, CART, NB, and SVM.
ISSN: 2169-3536
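
The Summary above describes the three HDHS stages only at a high level, and the paper's formulas are not reproduced in this record. For orientation, the following Python sketch shows one way a hybrid sampling pipeline along those lines could be organized: minority samples are weighted by density and closeness factors, majority samples inside an expanding heat-driven region are removed, and new minority samples are synthesized by weighted interpolation. The function name hdhs_like_sampling, the specific weighting formula, the decay schedule, the radius-expansion rule, and the equilibrium threshold are all placeholder assumptions for illustration, not the authors' implementation.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def hdhs_like_sampling(X, y, minority_label=1, k=5,
                       initial_heat=1.0, decay=0.5, heat_threshold=0.05,
                       random_state=0):
    # Illustrative sketch only: the weighting, decay schedule, and stopping
    # rule below are placeholder assumptions, not the HDHS paper's formulas.
    rng = np.random.default_rng(random_state)
    min_mask = (y == minority_label)
    X_min, X_maj = X[min_mask], X[~min_mask]
    y_maj = y[~min_mask]

    # Step 1: weight minority samples by a density factor (inverse mean
    # distance to the k nearest minority neighbours) and a closeness factor
    # (inverse distance to the nearest majority sample).
    nn_min = NearestNeighbors(n_neighbors=min(k + 1, len(X_min))).fit(X_min)
    d_min, idx_min = nn_min.kneighbors(X_min)
    density = 1.0 / (d_min[:, 1:].mean(axis=1) + 1e-12)
    d_maj, _ = NearestNeighbors(n_neighbors=1).fit(X_maj).kneighbors(X_min)
    closeness = 1.0 / (d_maj[:, 0] + 1e-12)
    weights = density * closeness
    weights /= weights.sum()

    # Step 2: "heat decay" cleaning. Each minority sample starts with a heat
    # proportional to its weight; while the heat exceeds the threshold, its
    # sampling radius grows and majority samples inside it are discarded.
    keep_maj = np.ones(len(X_maj), dtype=bool)
    for i, x in enumerate(X_min):
        heat = initial_heat * weights[i] * len(X_min)
        radius = d_min[i, 1:].mean()          # seed radius from local scale
        while heat > heat_threshold:
            keep_maj &= np.linalg.norm(X_maj - x, axis=1) > radius
            heat *= decay                     # heat dissipates each step
            radius *= 1.0 + decay             # region expands as it cools
    X_maj_kept, y_maj_kept = X_maj[keep_maj], y_maj[keep_maj]

    # Step 3: weighted oversampling - interpolate between a weight-sampled
    # minority point and one of its minority neighbours (SMOTE-style).
    n_new = max(len(X_maj_kept) - len(X_min), 0)
    synthetic = []
    for _ in range(n_new):
        i = rng.choice(len(X_min), p=weights)
        j = rng.choice(idx_min[i, 1:])
        synthetic.append(X_min[i] + rng.random() * (X_min[j] - X_min[i]))
    X_syn = np.array(synthetic) if synthetic else np.empty((0, X.shape[1]))

    X_out = np.vstack([X_maj_kept, X_min, X_syn])
    y_out = np.concatenate([y_maj_kept,
                            np.full(len(X_min) + len(X_syn), minority_label)])
    return X_out, y_out

In this sketch, the decay factor controls how far the overlap-cleaning region spreads before the heat falls below the threshold (a stand-in for the paper's notion of thermal equilibrium). As a hypothetical usage, applying the function to an imbalanced set from sklearn.datasets.make_classification with weights=[0.9, 0.1] yields a roughly balanced training set that can then be passed to KNN, CART, NB, or SVM, mirroring the classifiers evaluated in the paper.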