Feature Representations Using the Reflected Rectified Linear Unit (RReLU) Activation

Deep Neural Networks (DNNs) have become the tool of choice for machine learning practitioners today. One important aspect of designing a neural network is the choice of activation function used at the neurons of the different layers. In this work, we introduce a four-output activation function...


Bibliographic Details
Main Authors: Chaity Banerjee, Tathagata Mukherjee, Eduardo Pasiliao Jr.
Format: Article
Language: English
Published: Tsinghua University Press, 2020-06-01
Series: Big Data Mining and Analytics
Online Access: https://www.sciopen.com/article/10.26599/BDMA.2019.9020024