A Green AI Methodology Based on Persistent Homology for Compressing BERT

Large Language Models (LLMs) like BERT have gained significant prominence due to their remarkable performance in a wide range of natural language processing tasks. However, they come with substantial computational and memory costs, and they are essentially black-box models, which makes them challenging to explain and interpret. In this article, Persistent BERT Compression and Explainability (PBCE) is proposed: a Green AI methodology that prunes BERT models using persistent homology, measuring the importance of each neuron through the topological characteristics of its outputs. As a result, PBCE can compress BERT significantly, reducing the number of parameters (47% of the original parameters for BERT Base, 42% for BERT Large). The methodology has been evaluated on the standard GLUE benchmark and compared against state-of-the-art techniques, achieving outstanding results. Consequently, PBCE simplifies the BERT model by providing explainability for its neurons and reducing the model's size, making it more suitable for deployment on resource-constrained devices.

Bibliographic Details
Main Authors: Luis Balderas, Miguel Lastra, José M. Benítez
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Applied Sciences
Subjects: BERT compression, Green AI, persistent homology, neural network explainability
Online Access: https://www.mdpi.com/2076-3417/15/1/390
Collection: DOAJ
Institution: OA Journals
Record ID: doaj-art-e541bd96739b4df2865041a4050573b2
ISSN: 2076-3417
DOI: 10.3390/app15010390
Citation: Applied Sciences, vol. 15, no. 1, art. 390, 2025, MDPI AG
Author affiliations:
Luis Balderas: Department of Computer Science and Artificial Intelligence, University of Granada, 18071 Granada, Spain
Miguel Lastra: Distributed Computational Intelligence and Time Series Lab, University of Granada, 18071 Granada, Spain
José M. Benítez: Department of Computer Science and Artificial Intelligence, University of Granada, 18071 Granada, Spain
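
The abstract above characterizes neuron importance through the topological characteristics of neuron outputs. For orientation only, the Python sketch below illustrates that general idea under assumptions of mine; it is not the PBCE implementation described in the article. It collects token-level activations from one BERT encoder layer, computes a zero-dimensional persistence diagram per neuron with the ripser library, scores each neuron by its total H0 persistence, and keeps the highest-scoring half. The chosen layer, the scoring function, and the keep ratio are illustrative, not values taken from the paper.

# Hypothetical sketch only: NOT the PBCE procedure from the article, just an
# illustration of scoring neurons by the persistent homology of their outputs.
# Layer index, H0 total-persistence score, and 50% keep ratio are assumptions.
import numpy as np
import torch
from ripser import ripser                      # pip install ripser
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

sentences = [
    "Persistent homology summarizes the shape of data.",
    "BERT produces contextual token embeddings.",
    "Pruning removes parameters that contribute little.",
    "Green AI promotes energy-efficient models.",
]

with torch.no_grad():
    batch = tokenizer(sentences, padding=True, return_tensors="pt")
    outputs = model(**batch)

# Hidden states of one encoder layer: (batch, seq_len, hidden_size).
layer_acts = outputs.hidden_states[6]                        # assumption: layer 6
acts = layer_acts.reshape(-1, layer_acts.shape[-1]).numpy()  # tokens x neurons
                                                             # (padding tokens kept for simplicity)

def total_h0_persistence(points: np.ndarray) -> float:
    """Sum of finite H0 bar lengths: a crude summary of the cloud's structure."""
    diagram = ripser(points, maxdim=0)["dgms"][0]
    finite = diagram[np.isfinite(diagram[:, 1])]
    return float((finite[:, 1] - finite[:, 0]).sum())

# Score each neuron by the persistence of its own activations across all
# tokens, viewed as a one-dimensional point cloud.
scores = np.array([
    total_h0_persistence(acts[:, j].reshape(-1, 1))
    for j in range(acts.shape[1])
])

# Keep the half of the neurons with the highest scores (arbitrary ratio
# chosen for this example only).
keep_mask = scores >= np.median(scores)
print(f"neurons kept in layer 6: {keep_mask.sum()} / {keep_mask.size}")

Total H0 persistence is used here simply because it is the most elementary persistent-homology summary of a one-dimensional activation cloud; the article itself may rely on different topological descriptors and a different pruning criterion.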