Version [1.0.3] — [CACP: Classification Algorithms Comparison Pipeline]

Bibliographic Details
Main Authors: Sylwester Czmil, Jacek Kluska, Anna Czmil
Format: Article
Language: English
Published: Elsevier 2024-12-01
Series: SoftwareX
Online Access: http://www.sciencedirect.com/science/article/pii/S235271102400308X
Description
Summary: We present the first major release of the Classification Algorithms Comparison Pipeline (CACP). The software enables users to compare newly developed classification algorithms in Python against other classifiers, evaluating classification performance while ensuring both the reproducibility and the statistical reliability of the outcomes. CACP considerably simplifies and accelerates the entire classifier evaluation process and helps prepare professional documentation of the experiments conducted. The upgrade enhances existing tools and adds new features: (1) support for datasets from the River machine learning library in incremental learning, (2) the ability to include user-defined datasets, (3) use of River classifiers for incremental learning, (4) use of River metrics for incremental learning, (5) the flexibility to create user-defined metrics, (6) record-by-record testing for incremental learning, (7) an enhanced summary of incremental testing results with dynamic visualization of the learning process, and (8) a graphical user interface (GUI).
ISSN: 2352-7110
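
The record-by-record incremental testing named as feature (6) corresponds to River's progressive validation loop, in which the model predicts each incoming record before learning from it. A minimal sketch using River's public API directly (the dataset, classifier, and metric below are illustrative choices, not CACP's own defaults):

from river import datasets, metrics, tree

# Online evaluation: predict each record first, then learn from it,
# updating the running metric one record at a time.
model = tree.HoeffdingTreeClassifier()
metric = metrics.Accuracy()

for x, y in datasets.Phishing():
    y_pred = model.predict_one(x)  # prediction before the label is revealed
    metric.update(y, y_pred)       # running metric, record by record
    model.learn_one(x, y)          # incremental update of the model

print(metric)  # final running accuracy over the stream

River also ships this loop as evaluate.progressive_val_score; the explicit version above simply makes the record-by-record mechanics visible.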