A lightweight transformer based multi task learning model with dynamic weight allocation for improved vulnerability prediction

Bibliographic Details
Main Authors: Lan Liu, Zhanfa Hui, Guiming Chen, Tingfeng Cai, Chiyu Zhou
Format: Article
Language:English
Published: Nature Portfolio 2025-08-01
Series:Scientific Reports
Subjects: Vulnerability prediction; Position encoding; Lightweight transformer; Dynamic weights; Multi-task learning
Online Access:https://doi.org/10.1038/s41598-025-10650-6
collection DOAJ
description Abstract Accurate vulnerability prediction is crucial for identifying potential security risks in software, especially in the context of imbalanced and complex real-world datasets. Traditional methods, such as single-task learning and ensemble approaches, often struggle with these challenges, particularly in detecting rare but critical vulnerabilities. To address this, we propose MTLPT (Multi-Task Learning with Position Encoding and Lightweight Transformer for Vulnerability Prediction), a novel multi-task learning framework that leverages custom lightweight Transformer blocks and position encoding layers to effectively capture long-range dependencies and complex patterns in source code. The MTLPT model improves sensitivity to rare vulnerabilities and incorporates a dynamic weight loss function to adjust for imbalanced data. Our experiments on real-world vulnerability datasets demonstrate that MTLPT outperforms traditional methods on key performance metrics such as recall, F1-score, AUC, and MCC. Ablation studies further validate the contributions of the lightweight Transformer blocks, position encoding layers, and dynamic weight loss function, confirming their role in enhancing the model’s predictive accuracy and efficiency.
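The record does not include the paper's implementation, so the exact form of MTLPT's dynamic weight loss is not shown here. As a rough, hypothetical sketch of the kind of imbalance-aware weighting the abstract describes, the snippet below rescales a cross-entropy loss by inverse class frequency, so rare classes (e.g. vulnerable samples) contribute more per example; all function names are illustrative, not from the paper.

```python
import numpy as np

def dynamic_class_weights(labels, n_classes):
    """Inverse-frequency weights: a class with few samples in the
    current batch gets a proportionally larger weight."""
    counts = np.bincount(labels, minlength=n_classes).astype(float)
    counts[counts == 0] = 1.0          # avoid division by zero
    return counts.sum() / (n_classes * counts)

def weighted_cross_entropy(probs, labels, class_weights):
    """Mean cross-entropy, with each sample scaled by the weight
    of its true class."""
    eps = 1e-12
    p_true = probs[np.arange(len(labels)), labels]
    return -(class_weights[labels] * np.log(p_true + eps)).mean()

# Toy imbalanced batch: class 1 ("vulnerable") appears once in eight.
labels = np.array([0, 0, 0, 0, 0, 0, 0, 1])
probs = np.full((8, 2), 0.5)           # uninformative predictions
w = dynamic_class_weights(labels, n_classes=2)   # rare class weighted 7x more
loss = weighted_cross_entropy(probs, labels, w)
```

Recomputing the weights per batch (rather than once over the full dataset) is one simple way such weighting becomes "dynamic"; the paper's actual scheme may differ.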
id doaj-art-03cb8f157c8f431e880674d2715db2ca
institution Kabale University
issn 2045-2322
topic Vulnerability prediction
Position encoding
Lightweight transformer
Dynamic weights
Multi-task learning