Balancing Explainability and Privacy in Bank Failure Prediction: A Differentially Private Glass-Box Approach

Predicting bank failures is a critical task that requires balancing the need for model explainability against the necessity of preserving data privacy. Traditional machine learning models often lack transparency, which poses challenges for stakeholders who need to understand the factors driving predictions. In this study, we employ differentially private glass-box models, namely the Explainable Boosting Machine (EBM) and Neural Additive Models (NAM), to address these issues. We analyzed data from 21,243 American banks spanning 1969 to 2021, focusing on key financial ratios. By applying Differential Privacy (DP) to these models, we aimed to protect sensitive financial data while evaluating the trade-offs between privacy, accuracy, and explainability. Our main findings are as follows: 1) in the absence of privacy constraints, the models consistently identified the Asset Turnover, Total Debt / Invested Capital, and ROE ratios as the most influential factors in predicting bank failure, in that order; 2) when the privacy budget $\epsilon \leq 1$, only EBM maintained significant performance; 3) the reduction in explainability due to privacy protection was more pronounced for variables with initially lower explanatory power, while Asset Turnover retained its explanatory power even at $\epsilon = 0.01$. These findings provide valuable insights for banks, policymakers, and investors, suggesting that glass-box models can offer a promising solution for reliable and explainable bank failure prediction under privacy constraints.
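As a rough, illustrative companion to the abstract (not code from the article), the sketch below shows how a differentially private Explainable Boosting Machine might be fit to tabular financial-ratio data with the InterpretML library. The feature names, the synthetic labels, and the availability of `DPExplainableBoostingClassifier` under `interpret.privacy` with `epsilon`/`delta` parameters are assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch (assumed API, not the authors' code): fit a differentially
# private Explainable Boosting Machine on bank financial ratios and inspect
# the per-feature glass-box contributions.
import numpy as np
import pandas as pd
from interpret.privacy import DPExplainableBoostingClassifier  # assumed import path

rng = np.random.default_rng(0)
n = 5000
# Illustrative feature names; the paper uses a broader set of financial ratios.
X = pd.DataFrame({
    "asset_turnover": rng.normal(0.05, 0.02, n),
    "total_debt_to_invested_capital": rng.normal(0.6, 0.2, n),
    "roe": rng.normal(0.08, 0.05, n),
})
# Synthetic failure label, loosely tied to leverage and profitability.
y = (0.5 * X["total_debt_to_invested_capital"] - X["roe"]
     + rng.normal(0, 0.1, n) > 0.35).astype(int)

# The privacy budget epsilon controls the privacy/utility trade-off:
# smaller epsilon means stronger privacy and noisier shape functions.
dp_ebm = DPExplainableBoostingClassifier(epsilon=1.0, delta=1e-5)
dp_ebm.fit(X, y)

# Global explanation: per-feature importance scores of the additive model.
explanation = dp_ebm.explain_global()
print(dict(zip(explanation.data()["names"], explanation.data()["scores"])))
```

Sweeping `epsilon` downward (e.g., 1.0, 0.1, 0.01) and re-reading the global importances is one way to probe the trade-off the abstract describes, where stronger privacy erodes the explanatory power of weaker features first.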

Bibliographic Details
Main Authors: Junyoung Byun, Jaewook Lee, Hyeongyeong Lee, Bumho Son
Author Affiliations: Department of Applied Statistics, Chung-Ang University, Seoul, Republic of Korea (Byun, H. Lee); Department of Industrial Engineering, Seoul National University, Seoul, Republic of Korea (J. Lee); Department of Business Administration, Chung-Ang University, Seoul, Republic of Korea (Son)
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access, Vol. 13, pp. 1546-1565
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3523967
Subjects: Bank failure prediction; Differential privacy; Explainable artificial intelligence; Explainable boosting machine; Neural additive models
Online Access: https://ieeexplore.ieee.org/document/10818483/