BERT Mutation: Deep Transformer Model for Masked Uniform Mutation in Genetic Programming
We introduce BERT mutation, a novel, domain-independent mutation operator for Genetic Programming (GP) that leverages advanced Natural Language Processing (NLP) techniques to improve convergence, particularly using the Masked Language Modeling approach. By combining the capabilities of deep reinforcement learning and the BERT transformer architecture, BERT mutation intelligently suggests node replacements within GP trees to enhance their fitness. Unlike traditional stochastic mutation methods, BERT mutation adapts dynamically by using historical fitness data to optimize mutation decisions, resulting in more effective evolutionary improvements. Through comprehensive evaluations across three benchmark domains, we demonstrate that BERT mutation significantly outperforms conventional and state-of-the-art mutation operators in terms of convergence speed and solution quality. This work represents a pivotal step toward integrating state-of-the-art deep learning into evolutionary algorithms, pushing the boundaries of adaptive optimization in GP.
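The abstract describes the operator only at a high level. As a rough illustration of the masked-node idea, here is a minimal Python sketch that mutates a GP tree in prefix notation by picking one node and sampling a same-arity replacement from model scores. The primitive set, the `masked_node_mutation` helper, and the `score_replacements` stand-in (a uniform-probability placeholder for the trained BERT masked-language model described in the paper) are assumptions for illustration, not the authors' implementation.

```python
import random

# Hypothetical primitive set for illustration: symbol -> arity (0 = terminal).
PRIMITIVES = {"add": 2, "sub": 2, "mul": 2, "x": 0, "1": 0}

def score_replacements(tokens, mask_index, candidates):
    """Stand-in for the trained BERT masked-language model described in the paper.

    A real implementation would mask position `mask_index` in `tokens` and read the
    model's probabilities over `candidates`; here we fall back to a uniform
    distribution so the sketch stays self-contained.
    """
    return {c: 1.0 / len(candidates) for c in candidates}

def masked_node_mutation(tokens):
    """Replace one node of a prefix-notation GP tree with a same-arity symbol."""
    i = random.randrange(len(tokens))
    arity = PRIMITIVES[tokens[i]]
    candidates = [s for s, a in PRIMITIVES.items() if a == arity and s != tokens[i]]
    if not candidates:
        return tokens  # nothing of matching arity to swap in
    scores = score_replacements(tokens, i, candidates)
    total = sum(scores.values())
    r, acc = random.uniform(0.0, total), 0.0
    for symbol, weight in scores.items():
        acc += weight
        if r <= acc:
            return tokens[:i] + [symbol] + tokens[i + 1:]
    return tokens

# Example: mutate (add x (mul x 1)), written in prefix order.
print(masked_node_mutation(["add", "x", "mul", "x", "1"]))
```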
Saved in:
| Main Authors: | Eliad Shem-Tov, Moshe Sipper, Achiya Elyasaf |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-02-01 |
| Series: | Mathematics |
| Subjects: | genetic programming; mutation operator; reinforcement learning; combinatorial optimization; surrogate model; symbolic regression |
| Online Access: | https://www.mdpi.com/2227-7390/13/5/779 |
| _version_ | 1850228195832365056 |
|---|---|
| author | Eliad Shem-Tov; Moshe Sipper; Achiya Elyasaf |
| author_sort | Eliad Shem-Tov |
| collection | DOAJ |
| description | We introduce BERT mutation, a novel, domain-independent mutation operator for Genetic Programming (GP) that leverages advanced Natural Language Processing (NLP) techniques to improve convergence, particularly using the Masked Language Modeling approach. By combining the capabilities of deep reinforcement learning and the BERT transformer architecture, BERT mutation intelligently suggests node replacements within GP trees to enhance their fitness. Unlike traditional stochastic mutation methods, BERT mutation adapts dynamically by using historical fitness data to optimize mutation decisions, resulting in more effective evolutionary improvements. Through comprehensive evaluations across three benchmark domains, we demonstrate that BERT mutation significantly outperforms conventional and state-of-the-art mutation operators in terms of convergence speed and solution quality. This work represents a pivotal step toward integrating state-of-the-art deep learning into evolutionary algorithms, pushing the boundaries of adaptive optimization in GP. |
| format | Article |
| id | doaj-art-06586f2740244f76907d2b4db8dd2b93 |
| institution | OA Journals |
| issn | 2227-7390 |
| language | English |
| publishDate | 2025-02-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | Mathematics |
| doi | 10.3390/math13050779 |
| author affiliations | Eliad Shem-Tov: Department of Software and Information Systems Engineering, Ben-Gurion University of the Negev, Beer-Sheva 8410501, Israel; Moshe Sipper: Department of Computer Science, Ben-Gurion University, Beer-Sheva 8410501, Israel; Achiya Elyasaf: Department of Software and Information Systems Engineering, Ben-Gurion University of the Negev, Beer-Sheva 8410501, Israel |
| title | BERT Mutation: Deep Transformer Model for Masked Uniform Mutation in Genetic Programming |
| topic | genetic programming; mutation operator; reinforcement learning; combinatorial optimization; surrogate model; symbolic regression |
| url | https://www.mdpi.com/2227-7390/13/5/779 |