Improving Neural Network Efficiency Using Piecewise Linear Approximation of Activation Functions

| Main Authors: | Pavan Reddy (https://orcid.org/0009-0001-4832-1845), Aditya Sanjay Gujral |
|---|---|
| Affiliation: | The George Washington University |
| Format: | Article |
| Language: | English |
| Published: | LibraryPress@UF, 2025-05-01 |
| Series: | Proceedings of the International Florida Artificial Intelligence Research Society Conference |
| DOI: | 10.32473/flairs.38.1.139005 |
| ISSN: | 2334-0754, 2334-0762 |
| Collection: | DOAJ |
| Online Access: | https://journals.flvc.org/FLAIRS/article/view/139005 |

Description:

Activation functions play a pivotal role in neural networks by enabling the modeling of complex non-linear relationships within data. However, the computational cost associated with certain activation functions, such as the hyperbolic tangent (tanh) and its gradient, can be substantial. In this study, we demonstrate that a piecewise linear approximation of the tanh function, utilizing pre-calculated slopes, achieves faster computation without significant degradation in performance. Conversely, we show that a piecewise linear approximation of the sigmoid function is computationally slower than its continuous counterpart. These findings suggest that the computational efficiency of a piecewise linear activation function depends on whether the indexing and arithmetic costs of the approximation are lower than those of the continuous function.
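
To make the mechanism concrete, here is a minimal NumPy sketch of a piecewise linear tanh with pre-calculated slopes, in the spirit of the abstract. The segment count, the clipping range, and all names (`pwl_tanh`, `N_SEGMENTS`, and so on) are illustrative assumptions, not details taken from the paper itself.

```python
import numpy as np

# Approximate tanh with linear segments whose slopes and intercepts are
# precomputed once, so each activation costs one segment lookup plus one
# multiply-add. N_SEGMENTS and the [-4, 4] range are assumed values.
N_SEGMENTS = 64
LO, HI = -4.0, 4.0  # tanh is effectively saturated outside this range

_x = np.linspace(LO, HI, N_SEGMENTS + 1)
_y = np.tanh(_x)
SLOPES = (_y[1:] - _y[:-1]) / (_x[1:] - _x[:-1])  # pre-calculated slopes
INTERCEPTS = _y[:-1] - SLOPES * _x[:-1]           # y = slope * x + intercept
INV_STEP = N_SEGMENTS / (HI - LO)                 # O(1) uniform-grid indexing

def pwl_tanh(x: np.ndarray) -> np.ndarray:
    """Piecewise linear tanh: clip, index the segment, apply its line."""
    x = np.clip(x, LO, HI)
    idx = np.minimum(((x - LO) * INV_STEP).astype(np.int64), N_SEGMENTS - 1)
    return SLOPES[idx] * x + INTERCEPTS[idx]

def pwl_tanh_grad(x: np.ndarray) -> np.ndarray:
    """The gradient of the approximation is just the active segment's
    slope, i.e. a single table lookup instead of evaluating 1 - tanh(x)^2."""
    x = np.clip(x, LO, HI)
    idx = np.minimum(((x - LO) * INV_STEP).astype(np.int64), N_SEGMENTS - 1)
    return SLOPES[idx]

if __name__ == "__main__":
    grid = np.linspace(-6.0, 6.0, 10_001)
    err = np.max(np.abs(pwl_tanh(grid) - np.tanh(grid)))
    print(f"max |pwl_tanh - tanh| with {N_SEGMENTS} segments: {err:.2e}")
```

Whether a sketch like this actually wins in practice depends, as the abstract notes, on whether the indexing plus multiply-add undercuts the cost of the library's continuous implementation; the paper reports that this holds for tanh but not for the cheaper sigmoid.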