Feature Representations Using the Reflected Rectified Linear Unit (RReLU) Activation
Deep Neural Networks (DNNs) have become the tool of choice for machine learning practitioners today. One important aspect of designing a neural network is the choice of activation function used at the neurons of its layers. In this work, we introduce a four-output activation func...
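The abstract in this record is truncated, but as a rough illustration of how a multi-output activation can expand the feature representation, here is a minimal Python sketch. It assumes the four outputs are the ReLU of the input together with its reflections about the horizontal and vertical axes; the function name `rrelu_four_output` and this exact construction are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def rrelu_four_output(x):
    """Hypothetical sketch of a four-output reflected ReLU.

    Given a pre-activation vector x of shape (n,), returns an array of
    shape (4, n) that stacks ReLU(x) with its reflections about the
    horizontal and vertical axes. This is an illustrative guess, not
    the construction defined in the paper.
    """
    relu_pos = np.maximum(x, 0.0)       # standard ReLU(x)
    relu_neg = np.maximum(-x, 0.0)      # ReLU(-x): reflection about the vertical axis
    return np.stack([
        relu_pos,                       # ReLU(x)
        -relu_pos,                      # reflection about the horizontal axis
        relu_neg,                       # reflection about the vertical axis
        -relu_neg,                      # reflection about both axes
    ])

# Usage: each input feature yields four output features.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(rrelu_four_output(x).shape)       # (4, 5)
```

Under this assumed construction, a layer with n neurons would pass 4n features to the next layer, which is the sense in which such an activation changes the learned feature representation.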
| Main Authors: | Chaity Banerjee, Tathagata Mukherjee, Eduardo Pasiliao Jr. |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Tsinghua University Press, 2020-06-01 |
| Series: | Big Data Mining and Analytics |
| Online Access: | https://www.sciopen.com/article/10.26599/BDMA.2019.9020024 |
Similar Items
- Dual Input Interleaved Three-Phase Rectifier In Discontinuous Conduction Mode For Application In Small Wind Turbines
  by: Guilherme M. Todys, et al.
  Published: (2025-01-01)
- A low‐rating 40‐pulse AC–DC rectifier based on a new passive harmonic mitigation circuit
  by: Rohollah Abdollahi, et al.
  Published: (2022-12-01)
- Design of Self-Tuning Regulator Adaptive Cascade Control for Power Factor Correction in Boost Rectifier
  by: Chiyo Saito, et al.
  Published: (2024-12-01)
- Involvement of Inwardly Rectifying Potassium (Kir) Channels in the Toxicity of Flonicamid to *Drosophila melanogaster*
  by: Xuan Liu, et al.
  Published: (2025-01-01)
- Direct Torque Control of Dual Three Phase Induction Motor fed by Direct Power Control Rectifier using Fuzzy Logic Speed Controller
  by: Radhwane Sadouni, et al.
  Published: (2025-01-01)