Improving Neural Network Efficiency Using Piecewise Linear Approximation of Activation Functions
Activation functions play a pivotal role in Neural Networks by enabling the modeling of complex non-linear relationships within data. However, the computational cost associated with certain activation functions, such as the hyperbolic tangent (tanh) and its gradient, can be substantial. In this stu...
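The abstract's core idea, replacing tanh with a piecewise linear approximation, can be sketched as follows. This is a minimal illustration only, not the paper's actual method: the breakpoint count, placement, and the use of `np.interp` are assumptions made for the sketch.

```python
import numpy as np

# Breakpoints for the piecewise linear fit; their count and placement
# are illustrative assumptions, not the segmentation used in the paper.
BREAKPOINTS = np.linspace(-3.0, 3.0, 13)
VALUES = np.tanh(BREAKPOINTS)

def pwl_tanh(x):
    """Piecewise linear approximation of tanh.

    np.interp clamps to the endpoint values outside [-3, 3], which
    roughly matches tanh's saturation toward +/-1.
    """
    return np.interp(x, BREAKPOINTS, VALUES)

x = np.linspace(-5.0, 5.0, 101)
max_err = np.max(np.abs(pwl_tanh(x) - np.tanh(x)))
print(f"max abs error on [-5, 5]: {max_err:.4f}")
```

Table lookup plus linear interpolation avoids the exponentials inside tanh, trading a small, bounded approximation error for cheaper arithmetic; tightening the breakpoint spacing shrinks the error at the cost of a larger table.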
| Main Authors: | Pavan Reddy, Aditya Sanjay Gujral |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | LibraryPress@UF, 2025-05-01 |
| Series: | Proceedings of the International Florida Artificial Intelligence Research Society Conference |
| Online Access: | https://journals.flvc.org/FLAIRS/article/view/139005 |
Similar Items
- Dynamics of recurrent neural networks with piecewise linear activation function in the context-dependent decision-making task
  by: Kononov, Roman Andreevich, et al.
  Published: (2025-03-01)
- Localizing Adversarial Attacks To Produces More Imperceptible Noise
  by: Pavan Reddy, et al.
  Published: (2025-05-01)
- Spectral Method of Identification of Peltier Thermoelectric Elements Based on Piecewise Linear Approximation
  by: Gleb Vasilyev, et al.
  Published: (2023-12-01)
- Piecewise Linear Approximation of Elliptical Neutron Guides—A Case Study for BIFROST at ESS
  by: Daniel Lomholt Christensen, et al.
  Published: (2025-02-01)
- Progressive Bounded Error Piecewise Linear Approximation with Resolution Reduction for Time Series Data Compression
  by: Jeng-Wei Lin, et al.
  Published: (2024-12-01)