Activations
Softmax
-
layer eddl::Softmax(layer parent, int axis = -1, string name = "")
Applies the Softmax activation function to the given layer.
- Parameters
parent – Parent layer
axis – Dimension in which to operate. Default -1, which uses the last axis
name – Name of the layer
- Returns
Output of Softmax transformation
The Softmax activation function is: softmax(x) = exp(x) / reduce_sum(exp(x)), where the sum is taken along the selected axis.
Example:
l = Softmax(l);
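For context, a minimal sketch of a classifier head built around Softmax (assuming the usual "eddl/apis/eddl.h" header; the input size and class count here are illustrative assumptions, not part of this reference):

#include "eddl/apis/eddl.h"
using namespace eddl;

int main() {
    layer in = Input({784});           // e.g. a flattened 28x28 image (assumption)
    layer l = ReLu(Dense(in, 128));    // hidden layer
    layer out = Softmax(Dense(l, 10)); // probabilities over the last axis (axis = -1)
    model net = Model({in}, {out});
    return 0;
}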
Sigmoid
-
layer eddl::Sigmoid(layer parent, string name = "")
Applies a Sigmoid activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of Sigmoid activation
The Sigmoid activation function is: sigmoid(x) = 1 / (1 + exp(-x))
Example:
l = Sigmoid(l);
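A common use is squashing a single-unit output into (0, 1) for binary classification. A sketch, assuming an existing layer l and using namespace eddl:

layer out = Sigmoid(Dense(l, 1)); // single logit mapped to (0, 1); sigmoid(0) = 0.5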
ReLu
-
layer eddl::ReLu(layer parent, string name = "")
Applies a Rectified Linear Unit activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of ReLu activation
The ReLu activation function is:
if x > 0: relu(x) = x
else: relu(x) = 0
Example:
l = ReLu(l);
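A sketch of ReLu between stacked dense layers (the layer sizes are illustrative assumptions):

l = ReLu(Dense(l, 256)); // keeps positive pre-activations, zeroes the rest
l = ReLu(Dense(l, 128));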
Thresholded ReLu
-
layer eddl::ThresholdedReLu(layer parent, float alpha = 1.0, string name = "")
Applies the Thresholded version of a Rectified Linear Unit activation function to the given layer.
- Parameters
parent – Parent layer
alpha – Threshold value
name – Name of the layer
- Returns
Output of Thresholded ReLu activation
The Thresholded ReLu activation function is:
if x > alpha: thresholded_relu(x) = x
else: thresholded_relu(x) = 0
Example:
l = ThresholdedReLu(l, 1.0);
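A standalone numeric check of the piecewise definition above (plain C++, independent of EDDL):

#include <cstdio>

float thresholded_relu(float x, float alpha) {
    return x > alpha ? x : 0.0f;
}

int main() {
    // With alpha = 1.0, values at or below the threshold are zeroed
    printf("%g %g %g\n",
           thresholded_relu(0.5f, 1.0f),  // 0
           thresholded_relu(1.0f, 1.0f),  // 0 (x must be strictly greater than alpha)
           thresholded_relu(2.0f, 1.0f)); // 2
    return 0;
}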
Leaky ReLu
-
layer eddl::LeakyReLu(layer parent, float alpha = 0.01, string name = "")
Applies the Leaky version of a Rectified Linear Unit activation function to the given layer.
- Parameters
parent – Parent layer
alpha – Negative slope coefficient
name – Name of the layer
- Returns
Output of Leaky ReLu activation
The Leaky ReLu activation function is:
if x > 0: leaky_relu(x) = x
else: leaky_relu(x) = alpha * x
Example:
l = LeakyReLu(l, 0.01);
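A numeric sketch of the definition (plain C++): unlike plain ReLu, a scaled-down signal survives for negative inputs:

#include <cstdio>

float leaky_relu(float x, float alpha) {
    return x > 0 ? x : alpha * x;
}

int main() {
    printf("%g %g\n",
           leaky_relu(3.0f, 0.01f),   // 3
           leaky_relu(-2.0f, 0.01f)); // -0.02 instead of ReLu's 0
    return 0;
}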
ELu
-
layer eddl::Elu(layer parent, float alpha = 1.0, string name = "")
Applies the Exponential Linear Unit activation function to the given layer.
- Parameters
parent – Parent layer
alpha – ELu coefficient
name – Name of the layer
- Returns
Output of ELu activation
The ELu activation function is:
if x > 0: elu(x) = x
else: elu(x) = alpha * (exp(x) - 1)
Example:
l = Elu(l, 1.0);
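A numeric sketch (plain C++): with alpha = 1.0 the function is continuous at 0 and saturates towards -alpha for very negative inputs:

#include <cmath>
#include <cstdio>

float elu(float x, float alpha) {
    return x > 0 ? x : alpha * (std::exp(x) - 1.0f);
}

int main() {
    printf("%g %g %g\n",
           elu(1.0f, 1.0f),    // 1
           elu(0.0f, 1.0f),    // 0
           elu(-10.0f, 1.0f)); // ~ -0.99995, approaching -alpha = -1
    return 0;
}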
SeLu
-
layer eddl::Selu(layer parent, string name = "")
Applies the Scaled Exponential Linear Unit activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of Selu activation
The SeLu activation function is:
if x > 0: selu(x) = scale * x
else: selu(x) = scale * alpha * (exp(x) - 1)
where alpha = 1.6732632423543772848170429916717 and scale = 1.0507009873554804934193349852946
Example:
l = Selu(l);
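A numeric sketch (plain C++) using the constants from the definition above:

#include <cmath>
#include <cstdio>

const float ALPHA = 1.6732632423543772848170429916717f;
const float SCALE = 1.0507009873554804934193349852946f;

float selu(float x) {
    return x > 0 ? SCALE * x : SCALE * ALPHA * (std::exp(x) - 1.0f);
}

int main() {
    printf("%g %g\n",
           selu(1.0f),   // ~ 1.0507 (positive inputs are just scaled)
           selu(-1.0f)); // ~ -1.1113
    return 0;
}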
Exponential
-
layer eddl::Exponential(layer parent, string name = "")
Applies the Exponential (base e) activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of Exponential activation
The Exponential activation function is: exp(x)
Example:
l = Exponential(l);
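One typical use (a sketch, not prescribed by this reference) is forcing a strictly positive output, e.g. when a model predicts a scale or rate parameter. Assuming an existing layer l and using namespace eddl:

layer out = Exponential(Dense(l, 1)); // exp of any real logit is > 0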
Softplus
-
layer eddl::Softplus(layer parent, string name = "")
Applies the Softplus activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of Softplus activation
The Softplus activation function is: softplus(x) = log(1 + exp(x))
Example:
l = Softplus(l);
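A numeric sketch (plain C++): Softplus behaves as a smooth approximation of ReLu:

#include <cmath>
#include <cstdio>

float softplus(float x) {
    return std::log(1.0f + std::exp(x));
}

int main() {
    printf("%g %g %g\n",
           softplus(-10.0f), // ~ 0.000045 (near 0, like ReLu)
           softplus(0.0f),   // log(2) ~ 0.693, the largest deviation from ReLu
           softplus(10.0f)); // ~ 10.00005 (near x, like ReLu)
    return 0;
}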
Softsign
-
layer eddl::Softsign(layer parent, string name = "")
Applies the Softsign activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of Softsign activation
The Softsign activation function is: softsign(x) = x / (1 + abs(x))
Example:
l = Softsign(l);
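A numeric sketch (plain C++): Softsign approaches +/-1 like Tanh, but polynomially rather than exponentially, so it saturates more slowly:

#include <cmath>
#include <cstdio>

float softsign(float x) {
    return x / (1.0f + std::fabs(x));
}

int main() {
    printf("%g vs %g\n", softsign(3.0f), std::tanh(3.0f)); // 0.75 vs ~0.995
    return 0;
}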
Linear
-
layer eddl::Linear(layer parent, float alpha = 1.0, string name = "")
Applies the Linear activation function to the given layer.
- Parameters
parent – Parent layer
alpha – Linear coefficient
name – Name of the layer
- Returns
Output of Linear activation
The Linear activation function is: linear(x) = alpha * x
Example:
l = Linear(l, 1.0);
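With the default alpha = 1.0 this is the identity, a common choice for a regression output head. A sketch, assuming an existing layer l and using namespace eddl:

layer out = Linear(Dense(l, 1), 1.0); // raw, unbounded regression output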
Tanh
-
layer eddl::Tanh(layer parent, string name = "")
Applies the Hyperbolic tangent activation function to the given layer.
- Parameters
parent – Parent layer
name – Name of the layer
- Returns
Output of Tanh activation
The Tanh activation function is: tanh(x) = sinh(x)/cosh(x) = ((exp(x) - exp(-x))/(exp(x) + exp(-x)))
Example:
l = Tanh(l);
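A numeric sketch (plain C++) of a useful identity: Tanh is a rescaled Sigmoid, tanh(x) = 2 * sigmoid(2x) - 1:

#include <cmath>
#include <cstdio>

float sigmoid(float x) {
    return 1.0f / (1.0f + std::exp(-x));
}

int main() {
    float x = 0.7f;
    printf("%g %g\n", std::tanh(x), 2.0f * sigmoid(2.0f * x) - 1.0f); // identical
    return 0;
}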