AReLU: Agile Rectified Linear Unit for Improving Lightweight Convolutional Neural Networks

Dynamic activation functions usually yield remarkable improvements for neural networks. Those that depend on input features perform better than input-independent ones, but the improvements come at extra memory and computational cost, which is non-negligible for...


Bibliographic Details
Main Authors: Fu Chen, Yepeng Guan
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10843665/