Multiple objectives escaping bird search optimization and its application in stock market prediction based on transformer model

Abstract Stock market prediction has long attracted the attention of academia and industry due to its potential for substantial financial returns. Although various forecasting methods are available, such as CNN, LSTM, BiLSTM, GRU, and Transformer models, their hyperparameter optimization often faces limitations; in particular, single-objective optimization can easily fall into local optima. To address this issue, this paper proposes an innovative multi-objective optimization algorithm, the Multi-Objective Escaping Bird Search (MOEBS), and introduces the MOEBS-Transformer architecture to improve the efficiency and effectiveness of hyperparameter optimization for Transformer models. The study first validates the performance of MOEBS through a series of multi-objective benchmark tests on standard problem sets such as ZDT, DTLZ, and WFG, comparing it with other multi-objective optimization algorithms (e.g., MOMVO, MSSA, and MOEAD) using evaluation metrics including GD, Spacing, IGD, and HV. For stock price prediction, the closing-price datasets of Amazon, Google, and Uniqlo are selected, and MOEBS is used to optimize the core hyperparameters of the Transformer against multiple objectives: training-set RMSE, testing-set RMSE, and testing-set error variance. The experiments first compare CNN, LSTM, BiLSTM, GRU, and the traditional Transformer to establish the Transformer as the strongest baseline for stock market prediction. The study then compares the MOEBS-Transformer with Transformers tuned by other multi-objective optimizers (MOMVO-Transformer, MSSA-Transformer, and MOEAD-Transformer) as well as by conventional methods: Random Search (RS-Transformer), Grid Search (GS-Transformer), and Bayesian Optimization (BO-Transformer). Assessed by R², RMSE, and RPD on both training and testing sets, the Transformer optimized by MOEBS significantly outperforms the other methods in prediction accuracy and stability. This research offers a new solution for complex optimization scenarios and lays a foundation for advances in stock market prediction technology.
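
For orientation, the hyperparameter search described above can be written as a vector minimization over the Transformer hyperparameters θ. The notation below is our paraphrase of the abstract rather than the paper's own, and the metric definitions are the standard ones:

$$
\min_{\theta \in \Theta}\;\Bigl(\mathrm{RMSE}_{\mathrm{train}}(\theta),\;\mathrm{RMSE}_{\mathrm{test}}(\theta),\;\operatorname{Var}\bigl(e_{\mathrm{test}}(\theta)\bigr)\Bigr),
\qquad
\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(y_i-\hat{y}_i\bigr)^2},
$$

$$
R^2 = 1-\frac{\sum_{i=1}^{n}\bigl(y_i-\hat{y}_i\bigr)^2}{\sum_{i=1}^{n}\bigl(y_i-\bar{y}\bigr)^2},
\qquad
\mathrm{RPD}=\frac{\mathrm{SD}(y)}{\mathrm{RMSE}},
$$

where $e_{\mathrm{test}} = y - \hat{y}$ denotes the test-set residuals. Because the three objectives generally conflict, a multi-objective optimizer returns a set of Pareto trade-off solutions (no other setting improves one objective without worsening another) rather than a single optimum.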

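A minimal sketch of how such a three-objective fitness might be evaluated and compared under Pareto dominance is given below. It is illustrative only: NumPy and the function names are our assumptions, not the authors' MOEBS implementation.

```python
import numpy as np

def objectives(y_train, yhat_train, y_test, yhat_test):
    """Return the three-objective fitness vector named in the abstract:
    training-set RMSE, testing-set RMSE, and testing-set error variance.
    (Illustrative helper, not the paper's code.)"""
    rmse_train = float(np.sqrt(np.mean((y_train - yhat_train) ** 2)))
    rmse_test = float(np.sqrt(np.mean((y_test - yhat_test) ** 2)))
    err_var = float(np.var(y_test - yhat_test))
    return np.array([rmse_train, rmse_test, err_var])

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b when it is no
    worse in every objective and strictly better in at least one."""
    return bool(np.all(a <= b) and np.any(a < b))

# Hypothetical usage with synthetic targets and two candidate models,
# where candidate A predicts with less noise than candidate B:
rng = np.random.default_rng(0)
y_tr, y_te = rng.normal(size=200), rng.normal(size=100)
f_a = objectives(y_tr, y_tr + 0.1 * rng.normal(size=200),
                 y_te, y_te + 0.1 * rng.normal(size=100))
f_b = objectives(y_tr, y_tr + 0.5 * rng.normal(size=200),
                 y_te, y_te + 0.5 * rng.normal(size=100))
print(dominates(f_a, f_b))  # expected True: A is better on all three objectives
```
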
Bibliographic Details
Main Authors: Dedai Wei, Zimo Wang, Minyu Qiu, Juntao Yu, Jiaquan Yu, Yurun Jin, Xinye Sha, Kaichen Ouyang
Format: Article
Language: English
Published: Nature Portfolio, 2025-02-01
Series: Scientific Reports (ISSN 2045-2322)
Subjects: Stock price prediction; Multi-objective optimization; Escaping bird search; Transformer; Hyper-parameter optimization
Online Access:https://doi.org/10.1038/s41598-025-88883-8

Author Affiliations:
Dedai Wei: College of Economics, Shenyang University
Zimo Wang: Department of Philosophy, Nankai University
Minyu Qiu: College of Information Engineering, China Jiliang University
Juntao Yu: Department of Mathematics, University of Science and Technology of China
Jiaquan Yu: Department of Mathematics, University of Science and Technology of China
Yurun Jin: School of Computer Science and Technology, University of Science and Technology of China
Xinye Sha: Graduate School of Arts and Sciences, Columbia University
Kaichen Ouyang: Department of Mathematics, University of Science and Technology of China