AReLU: Agile Rectified Linear Unit for Improving Lightweight Convolutional Neural Networks
Dynamic activation functions often yield remarkable improvements for neural networks. Those that depend on the input features show better performance than input-independent ones, but these improvements come at extra memory and computational cost, which is non-negligible for...
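To make the distinction concrete, the sketch below contrasts a static parametric ReLU (fixed negative slope) with an input-dependent activation whose slope is predicted from a global statistic of each input. This is a minimal illustration of the general idea of input-dependent dynamic activations; the slope-generator form (`sigmoid(w * mean(x) + b)`) is an assumption for illustration and is not the AReLU formulation from the article.

```python
import numpy as np

def prelu(x, alpha):
    """Static parametric ReLU: one fixed learnable negative slope alpha."""
    return np.where(x > 0, x, alpha * x)

def dynamic_relu(x, w, b):
    """Input-dependent activation (illustrative, NOT the paper's AReLU):
    the negative slope is predicted from the input's global mean, so
    different inputs receive different slopes. The extra parameters (w, b)
    and the per-input slope computation are the source of the added
    memory and compute cost mentioned in the abstract."""
    ctx = x.mean()                                 # global context of this input
    alpha = 1.0 / (1.0 + np.exp(-(w * ctx + b)))   # hypothetical slope in (0, 1)
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 1.0, 3.0])
y_static = prelu(x, alpha=0.25)
y_dynamic = dynamic_relu(x, w=0.5, b=0.0)
```

Positive inputs pass through unchanged in both cases; only the treatment of negative inputs differs, and in the dynamic variant that treatment varies per input sample.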
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10843665/