Evolutionary search algorithm for learning activation function of an artificial neural network
Neural networks require careful selection of activation functions to optimize performance. Traditional methods of choosing activation functions through trial and error are time-consuming and resource-intensive. This paper presents a novel approach to automatically design activation functions for artificial neural networks using genetic programming combined with gradient descent. The proposed method aims to enhance the efficiency of the search process for optimal activation functions. Our algorithm employs genetic programming to evolve the general form of activation functions, while gradient descent optimizes their parameters during network training. This hybrid approach allows for the exploration of a wide range of potential activation functions tailored to specific tasks and network architectures. The method was evaluated on three datasets from the KEEL repository: Iris, Titanic, and Phoneme. The results demonstrate the algorithm's ability to generate and optimize custom activation functions, although improvements in network accuracy were not observed in this initial study. This work contributes to the ongoing research in neural network optimization and opens avenues for further investigation into the automatic design of activation functions.
Saved in:
| Main Author: | Yurshin Viacheslav |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | EDP Sciences, 2025-01-01 |
| Series: | ITM Web of Conferences |
| Online Access: | https://www.itm-conferences.org/articles/itmconf/pdf/2025/03/itmconf_hmmocs-III2024_05004.pdf |
| _version_ | 1850195541163507712 |
|---|---|
| author | Yurshin Viacheslav |
| collection | DOAJ |
| description | Neural networks require careful selection of activation functions to optimize performance. Traditional methods of choosing activation functions through trial and error are time-consuming and resource-intensive. This paper presents a novel approach to automatically design activation functions for artificial neural networks using genetic programming combined with gradient descent. The proposed method aims to enhance the efficiency of the search process for optimal activation functions. Our algorithm employs genetic programming to evolve the general form of activation functions, while gradient descent optimizes their parameters during network training. This hybrid approach allows for the exploration of a wide range of potential activation functions tailored to specific tasks and network architectures. The method was evaluated on three datasets from the KEEL repository: Iris, Titanic, and Phoneme. The results demonstrate the algorithm's ability to generate and optimize custom activation functions, although improvements in network accuracy were not observed in this initial study. This work contributes to the ongoing research in neural network optimization and opens avenues for further investigation into the automatic design of activation functions. |
| format | Article |
| id | doaj-art-aafd2a1fecac4ec08568b8248f0de264 |
| institution | OA Journals |
| issn | 2271-2097 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | EDP Sciences |
| record_format | Article |
| series | ITM Web of Conferences |
| spelling | DOAJ record doaj-art-aafd2a1fecac4ec08568b8248f0de264 (indexed 2025-08-20T02:13:44Z): EDP Sciences, ITM Web of Conferences, ISSN 2271-2097, published 2025-01-01, vol. 72, art. 05004, DOI 10.1051/itmconf/20257205004 (itmconf_hmmocs-III2024_05004). Yurshin Viacheslav, Siberian Federal University, Institute of Space and Information Technologies. |
| title | Evolutionary search algorithm for learning activation function of an artificial neural network |
| url | https://www.itm-conferences.org/articles/itmconf/pdf/2025/03/itmconf_hmmocs-III2024_05004.pdf |
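The hybrid approach summarized in the abstract — genetic programming evolving the symbolic form of an activation function while gradient descent tunes its numeric parameters — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the primitive set, the sum-of-two-terms representation, the mutation-only evolution loop, the finite-difference gradients, and the toy curve-fitting objective (standing in for network training) are all assumptions made for the example.

```python
# Sketch: GP evolves the form of an activation function; gradient descent tunes
# its scale parameters. All design choices here are illustrative assumptions.
import math
import random

random.seed(0)

# Primitive building blocks; each maps (input x, scale parameter a) to a value.
PRIMITIVES = {
    "tanh":    lambda x, a: math.tanh(a * x),
    "relu":    lambda x, a: max(0.0, a * x),
    "sigmoid": lambda x, a: 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, a * x)))),
    "linear":  lambda x, a: a * x,
}

def random_individual():
    """Candidate activation: a sum of two primitives with tunable scales."""
    return {"terms": random.sample(list(PRIMITIVES), 2), "params": [1.0, 1.0]}

def evaluate(ind, x):
    return sum(PRIMITIVES[t](x, a) for t, a in zip(ind["terms"], ind["params"]))

def loss(ind, xs, ys):
    """Mean squared error against a target curve (stand-in for network loss)."""
    return sum((evaluate(ind, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gradient_descent(ind, xs, ys, lr=0.05, steps=100, eps=1e-4):
    """Tune the numeric parameters of a fixed symbolic form (numeric gradients)."""
    for _ in range(steps):
        grads = []
        for i in range(len(ind["params"])):
            ind["params"][i] += eps
            hi = loss(ind, xs, ys)
            ind["params"][i] -= 2 * eps
            lo = loss(ind, xs, ys)
            ind["params"][i] += eps
            grads.append((hi - lo) / (2 * eps))
        ind["params"] = [a - lr * g for a, g in zip(ind["params"], grads)]
    return ind

def mutate(ind):
    """Structural mutation: swap one primitive for a randomly chosen one."""
    child = {"terms": list(ind["terms"]), "params": list(ind["params"])}
    child["terms"][random.randrange(2)] = random.choice(list(PRIMITIVES))
    return child

# Toy objective: fit a swish-like curve x * sigmoid(x) on a small grid.
xs = [i / 10.0 for i in range(-30, 31)]
ys = [x / (1.0 + math.exp(-x)) for x in xs]

population = [gradient_descent(random_individual(), xs, ys) for _ in range(6)]
for _ in range(10):  # generations
    population.sort(key=lambda ind: loss(ind, xs, ys))
    offspring = [gradient_descent(mutate(p), xs, ys) for p in population[:3]]
    population = population[:3] + offspring  # elitist replacement

best = min(population, key=lambda ind: loss(ind, xs, ys))
print("best form:", best["terms"], "loss:", round(loss(best, xs, ys), 4))
```

In the paper's setting the inner loop would be actual network training on the KEEL datasets (Iris, Titanic, Phoneme) rather than a curve fit, but the division of labor is the same: the evolutionary layer searches over discrete symbolic structures, while gradient descent handles the continuous parameters within each structure.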