AI::MXNet::Gluon::NN::Activation - Applies an activation function to input.
|
DESCRIPTION
Applies an activation function to input.
Parameters
----------
activation : str
Name of the activation function to use.
See mxnet.ndarray.Activation for available choices.
Input shape:
Arbitrary.
Output shape:
Same shape as input.
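A minimal usage sketch (the input values are illustrative, and the snippet assumes the usual block-call convention of the Perl Gluon bindings):

    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon::NN qw(nn);

    # Wrap the built-in 'relu' activation as a layer.
    my $act = nn->Activation(activation => 'relu');
    my $x   = mx->nd->array([[-1, 0, 2]]);
    my $y   = $act->($x);    # negative entries are clamped to 0
    print $y->aspdl;         # [[0 0 2]]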
|
AI::MXNet::Gluon::NN::LeakyReLU - Leaky version of a Rectified Linear Unit.
|
DESCRIPTION
Leaky version of a Rectified Linear Unit.
It allows a small, non-zero gradient when the unit is not active: f(x) = alpha * x for x < 0 and f(x) = x for x >= 0.
Parameters
----------
alpha : float
Slope coefficient for the negative half axis. Must be >= 0.
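For example, with alpha = 0.1 (illustrative values):

    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon::NN qw(nn);

    # Negative inputs are scaled by alpha instead of being zeroed.
    my $leaky = nn->LeakyReLU(alpha => 0.1);
    my $x = mx->nd->array([[-10, -1, 0, 5]]);
    print $leaky->($x)->aspdl;    # [[-1 -0.1 0 5]]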
|
AI::MXNet::Gluon::NN::PReLU - Parametric leaky version of a Rectified Linear Unit.
|
DESCRIPTION
Parametric leaky version of a Rectified Linear Unit ("Delving Deep into Rectifiers", He et al., 2015).
It learns the slope used when the unit is not active, instead of fixing it as LeakyReLU does.
Parameters
----------
alpha_initializer : Initializer
Initializer for the learnable alpha parameter.
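A sketch of typical use (assuming the Constant initializer accepts a named value argument; since PReLU carries a learnable parameter, it must be initialized before the first call):

    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon::NN qw(nn);

    # Start alpha at 0.25; it is then updated during training.
    my $prelu = nn->PReLU(alpha_initializer => mx->init->Constant(value => 0.25));
    $prelu->initialize;
    my $x = mx->nd->array([[-4, 4]]);
    print $prelu->($x)->aspdl;    # [[-1 4]] with the initial alpha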
|
AI::MXNet::Gluon::NN::ELU - Exponential Linear Unit (ELU)
|
DESCRIPTION
Exponential Linear Unit (ELU): f(x) = x for x > 0 and f(x) = alpha * (exp(x) - 1) for x <= 0.
"Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)", Clevert et al., 2016.
Published as a conference paper at ICLR 2016.
Parameters
----------
alpha : float
The alpha parameter as described by Clevert et al., 2016.
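For example, with alpha = 1.0 (illustrative values):

    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon::NN qw(nn);

    # ELU(-1) = alpha * (exp(-1) - 1) ~ -0.632
    my $elu = nn->ELU(alpha => 1.0);
    my $x = mx->nd->array([[-1, 0, 1]]);
    print $elu->($x)->aspdl;    # approx [[-0.632 0 1]]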
|
AI::MXNet::Gluon::NN::SELU - Scaled Exponential Linear Unit (SELU)
|
DESCRIPTION
Scaled Exponential Linear Unit (SELU).
SELU applies scale * elu(x, alpha) with fixed constants scale and alpha, chosen so that activations self-normalize across layers.
"Self-Normalizing Neural Networks", Klambauer et al., 2017.
|
AI::MXNet::Gluon::NN::Swish - Swish activation function.
|
DESCRIPTION
Swish activation function ("Searching for Activation Functions", Ramachandran et al., 2017).
Parameters
----------
beta : float
The beta coefficient in swish(x) = x * sigmoid(beta * x).
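For example, with beta = 1.0, swish reduces to x * sigmoid(x) (illustrative values):

    use AI::MXNet qw(mx);
    use AI::MXNet::Gluon::NN qw(nn);

    # swish(1) = 1 * sigmoid(1) ~ 0.731
    my $swish = nn->Swish(beta => 1.0);
    my $x = mx->nd->array([[-1, 0, 1]]);
    print $swish->($x)->aspdl;    # approx [[-0.269 0 0.731]]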
|