Multiclass skin lesion classification and localization from dermoscopic images using a novel network-level fused deep architecture and explainable artificial intelligence

Bibliographic Details
Main Authors: Mehak Arshad, Muhammad Attique Khan, Nouf Abdullah Almujally, Areej Alasiry, Mehrez Marzougui, Yunyoung Nam
Format: Article
Language:English
Published: BMC 2025-07-01
Series:BMC Medical Informatics and Decision Making
Subjects:
Online Access:https://doi.org/10.1186/s12911-025-03051-2
author Mehak Arshad
Muhammad Attique Khan
Nouf Abdullah Almujally
Areej Alasiry
Mehrez Marzougui
Yunyoung Nam
collection DOAJ
description Abstract Background and objective Early detection and classification of skin cancer are critical for improving patient outcomes. Dermoscopic image analysis using Computer-Aided Diagnostics (CAD) is a powerful tool to assist dermatologists in identifying and classifying skin lesions. Traditional machine learning models require extensive feature engineering, which is time-consuming and less effective in handling complex data like skin lesions. This study proposes a deep learning-based network-level fusion architecture that integrates multiple deep models to enhance the classification and localization of skin lesions in dermoscopic images. The goal is to address challenges like irregular lesion shapes, inter-class similarities, and class imbalances while providing explainability through artificial intelligence. Methods A novel hybrid contrast enhancement technique was applied for pre-processing and dataset augmentation. Two deep learning models, a 5-block inverted residual network and a 6-block inverted bottleneck network, were designed and fused at the network level using a depth concatenation approach. The models were trained using Bayesian optimization for hyperparameter tuning. Feature extraction was performed with a global average pooling layer, and shallow neural networks were used for final classification. Explainable AI techniques, including LIME, were used to interpret model predictions and localize lesion regions. Experiments were conducted on two publicly available datasets, HAM10000 and ISIC2018, which were split into training and testing sets. Results The proposed fused architecture achieved high classification accuracy, with results of 91.3% and 90.7% on the HAM10000 and ISIC2018 datasets, respectively. Sensitivity, precision, and F1-scores were significantly improved after data augmentation, with precision rates of up to 90.91%. 
The explainable AI component effectively localized lesion areas with high confidence, enhancing the model’s interpretability. Conclusions The network-level fusion architecture combined with explainable AI techniques significantly improved the classification and localization of skin lesions. The augmentation and contrast enhancement processes enhanced lesion visibility, while fusion of models optimized classification accuracy. This approach shows potential for implementation in CAD systems for skin cancer diagnosis, although future work is required to address the limitations of computational resource requirements and training time. Clinical trial number Not applicable.
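The network-level fusion described in the Methods concatenates the two branch networks' feature maps along the channel (depth) axis, then collapses them with global average pooling before the shallow classifier. A minimal NumPy sketch of that fusion step (the branch names, spatial size 7x7, and channel counts 256/512 are illustrative assumptions, not the paper's actual dimensions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical final feature maps for one image from the two branches:
# 5-block inverted residual branch -> (H, W, C1)
# 6-block inverted bottleneck branch -> (H, W, C2)
feat_residual = rng.random((7, 7, 256))
feat_bottleneck = rng.random((7, 7, 512))

# Network-level fusion: depth concatenation along the channel axis
fused = np.concatenate([feat_residual, feat_bottleneck], axis=-1)  # (7, 7, 768)

# Global average pooling: average out the spatial dimensions so each
# channel contributes one value to the feature vector fed to the
# shallow neural network classifier
gap_vector = fused.mean(axis=(0, 1))  # (768,)

print(fused.shape, gap_vector.shape)
```

Depth concatenation preserves both branches' channels intact (as opposed to element-wise addition, which would force the branches to share a channel count), which is why the fused descriptor's length is the sum of the two branch widths.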
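The LIME-based localization mentioned in the Methods works by perturbing interpretable image regions, querying the model on each perturbation, and fitting a locally weighted linear surrogate whose coefficients rank the regions. The study uses the LIME library on dermoscopic images; the from-scratch NumPy sketch below only illustrates that core loop on a toy image (the 8x8 image, the quadrant "superpixels", and the brightness-based stand-in classifier are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy image and toy "classifier": the score depends only on the
# top-left quadrant's brightness (a stand-in for a lesion region).
image = np.zeros((8, 8))
image[:4, :4] = 1.0

def predict(img):
    return img[:4, :4].mean()

# Four interpretable "superpixels": the image's 4x4 quadrants.
QUADS = [(0, 0), (0, 4), (4, 0), (4, 4)]

def apply_mask(z):
    """Gray out every quadrant whose entry in z is 0."""
    img = image.copy()
    for k, (r, c) in enumerate(QUADS):
        if z[k] == 0:
            img[r:r + 4, c:c + 4] = 0.0
    return img

# LIME core loop: sample binary masks, query the model on each
# perturbed image, weight samples by proximity to the original,
# and fit a weighted linear surrogate.
Z = rng.integers(0, 2, size=(200, 4))
y = np.array([predict(apply_mask(z)) for z in Z])
dist = 1.0 - Z.mean(axis=1)                  # fraction of regions removed
w = np.exp(-(dist ** 2) / 0.25)              # exponential proximity kernel
X = np.hstack([Z, np.ones((len(Z), 1))])     # add intercept column
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
importance = coef[:4]                        # per-region attribution

print(importance.round(3))  # top-left region dominates
```

In the real pipeline the superpixels come from an image segmentation rather than a fixed grid, and the highest-weighted superpixels form the localization map overlaid on the lesion.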
format Article
id doaj-art-10757269a8384d709fbf72bade832c2d
institution Kabale University
issn 1472-6947
language English
publishDate 2025-07-01
publisher BMC
record_format Article
series BMC Medical Informatics and Decision Making
spelling doaj-art-10757269a8384d709fbf72bade832c2d (indexed 2025-08-20T03:45:30Z)
eng | BMC | BMC Medical Informatics and Decision Making | ISSN 1472-6947 | 2025-07-01 | vol. 25, iss. 1, pp. 1-22 | 10.1186/s12911-025-03051-2
Multiclass skin lesion classification and localization from dermoscopic images using a novel network-level fused deep architecture and explainable artificial intelligence
Mehak Arshad, Department of Computer Science, HITEC University
Muhammad Attique Khan, Department of Computer Science, HITEC University
Nouf Abdullah Almujally, Department of Information Systems, College of Computer and Information Sciences, Princess Nourah Bint Abdulrahman University
Areej Alasiry, College of Computer Science, King Khalid University
Mehrez Marzougui, College of Computer Science, King Khalid University
Yunyoung Nam, Department of ICT Convergence, Soonchunhyang University
https://doi.org/10.1186/s12911-025-03051-2
Subjects: Dermoscopy; Skin cancer; Deep learning; Network-level fusion; Shallow neural network; Explainable artificial intelligence
title Multiclass skin lesion classification and localization from dermoscopic images using a novel network-level fused deep architecture and explainable artificial intelligence
topic Dermoscopy
Skin cancer
Deep learning
Network-level fusion
Shallow neural network
Explainable artificial intelligence
url https://doi.org/10.1186/s12911-025-03051-2