Normalization
BatchNormalization
layer eddl::BatchNormalization(layer parent, float momentum = 0.99f, float epsilon = 0.001f, bool affine = true, string name = "")
Batch normalization layer.
Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.
- Parameters
parent – Parent layer
momentum – Momentum for the moving mean and the moving variance
epsilon – Small float added to variance to avoid dividing by zero
affine – If true, this layer has learnable affine parameters
name – A name for the operation
- Returns
Parent layer after the normalization
Example:
l = BatchNormalization(l);
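For context, a minimal end-to-end sketch that places BatchNormalization between a dense layer and its activation. The surrounding calls (Input, Dense, ReLU, Softmax, Model) are the usual EDDL model-building API; the 784-feature input and 10 output classes are illustrative assumptions:

#include "eddl/apis/eddl.h"
using namespace eddl;

int main() {
    layer in = Input({784});
    layer l = Dense(in, 256);
    l = BatchNormalization(l);  // defaults: momentum=0.99, epsilon=0.001, affine=true
    l = ReLU(l);
    layer out = Softmax(Dense(l, 10));
    model net = Model({in}, {out});
    return 0;
}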
LayerNormalization
layer eddl::LayerNormalization(layer parent, float epsilon = 0.00001f, bool affine = true, string name = "")
Layer normalization layer.
Applies Layer Normalization over the input: the mean and standard deviation are computed across the features of each individual sample, so the normalization is independent of the batch size.
- Parameters
parent – Parent layer
epsilon – Value added to the denominator for numerical stability
affine – If true, this layer has learnable affine parameters
name – A name for the operation
- Returns
Parent layer after the normalization
Example:
l = LayerNormalization(l);
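Because the statistics are per sample, the layer behaves the same at training and inference time, which makes it a common drop-in alternative to batch normalization when batches are very small. A minimal sketch under the same assumed EDDL calls as the batch normalization example:

#include "eddl/apis/eddl.h"
using namespace eddl;

int main() {
    layer in = Input({784});
    layer l = Dense(in, 256);
    l = LayerNormalization(l);  // defaults: epsilon=0.00001, affine=true
    l = ReLU(l);
    layer out = Softmax(Dense(l, 10));
    model net = Model({in}, {out});
    return 0;
}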
GroupNormalization
layer eddl::GroupNormalization(layer parent, int groups, float epsilon = 0.001f, bool affine = true, string name = "")
Group normalization layer.
Divides the channels into groups and computes the mean and variance within each group for normalization. The computation is independent of the batch size.
- Parameters
parent – Parent layer
groups – Number of groups into which the channels are divided (the number of channels must be divisible by groups)
epsilon – Value added to the denominator for numerical stability
affine – If true, this layer has learnable affine parameters
name – A name for the operation
- Returns
Parent layer after the normalization
Example:
l = GroupNormalization(l, 8);
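A minimal convolutional sketch: the 32 filters of the preceding Conv split evenly into 8 groups of 4 channels each. Conv, MaxPool, Flatten and the {3, 32, 32} input shape are assumed from the usual EDDL API and are illustrative only:

#include "eddl/apis/eddl.h"
using namespace eddl;

int main() {
    layer in = Input({3, 32, 32});   // channels x height x width
    layer l = Conv(in, 32, {3, 3});  // 32 output channels
    l = GroupNormalization(l, 8);    // 8 groups of 4 channels each
    l = ReLU(l);
    l = MaxPool(l, {2, 2});
    layer out = Softmax(Dense(Flatten(l), 10));
    model net = Model({in}, {out});
    return 0;
}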