Evolutionary search algorithm for learning activation function of an artificial neural network
| Main Author: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | EDP Sciences, 2025-01-01 |
| Series: | ITM Web of Conferences |
| Online Access: | https://www.itm-conferences.org/articles/itmconf/pdf/2025/03/itmconf_hmmocs-III2024_05004.pdf |
| Summary: | Neural networks require careful selection of activation functions to optimize performance. Traditional methods of choosing activation functions through trial and error are time-consuming and resource-intensive. This paper presents a novel approach to automatically design activation functions for artificial neural networks using genetic programming combined with gradient descent. The proposed method aims to enhance the efficiency of the search process for optimal activation functions. Our algorithm employs genetic programming to evolve the general form of activation functions, while gradient descent optimizes their parameters during network training. This hybrid approach allows for the exploration of a wide range of potential activation functions tailored to specific tasks and network architectures. The method was evaluated on three datasets from the KEEL repository: Iris, Titanic, and Phoneme. The results demonstrate the algorithm's ability to generate and optimize custom activation functions, although improvements in network accuracy were not observed in this initial study. This work contributes to the ongoing research in neural network optimization and opens avenues for further investigation into the automatic design of activation functions. |
|---|---|
| ISSN: | 2271-2097 |
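The hybrid scheme described in the summary — an evolutionary outer loop that searches over the general form of the activation function, with an inner gradient-descent loop tuning its real-valued parameters — can be sketched roughly as follows. This is an illustrative toy, not the paper's actual implementation: the candidate forms, the parameterisation `f(x) = a * form(b * x)`, and the one-dimensional fitting task are all assumptions made for the sake of a self-contained example.

```python
import math
import random

# Candidate unary forms the evolutionary search can choose between.
# (Illustrative set; the paper evolves richer expression trees.)
UNARIES = {"tanh": math.tanh, "sin": math.sin, "relu": lambda z: max(0.0, z)}

def loss(form, a, b, data):
    """Mean squared error of f(x) = a * form(b * x) on (x, y) pairs."""
    f = UNARIES[form]
    return sum((a * f(b * x) - y) ** 2 for x, y in data) / len(data)

def fit_params(form, data, steps=300, lr=0.05, eps=1e-4):
    """Inner loop: numerical gradient descent on (a, b); the form is fixed."""
    a, b = 1.0, 1.0
    for _ in range(steps):
        ga = (loss(form, a + eps, b, data) - loss(form, a - eps, b, data)) / (2 * eps)
        gb = (loss(form, a, b + eps, data) - loss(form, a, b - eps, data)) / (2 * eps)
        a, b = a - lr * ga, b - lr * gb
    return loss(form, a, b, data), a, b

def evolve(data, generations=4, seed=0):
    """Outer loop: keep the best form found so far, refill the rest randomly."""
    rng = random.Random(seed)
    forms = list(UNARIES)
    best = min((fit_params(f, data)[0], f) for f in forms)
    for _ in range(generations):
        # "Mutation": the elite form survives, the rest are resampled.
        population = [best[1]] + [rng.choice(forms) for _ in range(2)]
        for f in population:
            best = min(best, (fit_params(f, data)[0], f))
    return best  # (loss, form name)

# Toy target that behaves like 2*tanh(x); the search should favour tanh.
data = [(x / 10.0, 2.0 * math.tanh(x / 10.0)) for x in range(-20, 21)]
best_loss, best_form = evolve(data)
print(best_form, round(best_loss, 4))
```

The separation of concerns mirrors the method in the summary: evolution only decides *which* functional form to use, while gradient descent handles the continuous parameters during fitting, so each candidate form is evaluated at (approximately) its best parameter setting.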