Recurrent
RNN
-
layer eddl::RNN(layer parent, int units, string activation = "tanh", bool use_bias = true, bool bidirectional = false, string name = "")
Fully-connected RNN in which the output is fed back into the input.
- Parameters
parent – Parent layer
units – Dimensionality of the output space
activation – Name of the activation function
use_bias – Whether the layer uses a bias vector
bidirectional – Whether the RNN is bidirectional or not
name – A name for the operation
- Returns
The RNN layer
Example:
l = RNN(l, 32);
You can check a full example in Advanced models.
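As an illustrative sketch (the input shape, class count, optimizer, and loss below are assumptions, not taken from this page), an RNN layer is typically embedded in a model like this:

```cpp
// Sketch only: a small sequence classifier built around eddl::RNN.
// Shapes, optimizer, loss, and metric names are illustrative assumptions.
#include <eddl/apis/eddl.h>
using namespace eddl;

int main() {
    layer in = Input({28});            // e.g., 28 features per timestep (assumed)
    layer l = RNN(in, 32, "tanh");     // 32 recurrent units, tanh activation
    l = Dense(l, 10);                  // project to 10 classes (assumed)
    layer out = Softmax(l);

    model net = Model({in}, {out});
    build(net,
          adam(0.001),                 // optimizer (assumed choice)
          {"softmax_cross_entropy"},   // loss
          {"categorical_accuracy"});   // metric
    summary(net);
    return 0;
}
```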
GRU
Gated Recurrent Unit - Cho et al. 2014.
-
layer eddl::GRU(layer parent, int units, bool mask_zeros = false, bool bidirectional = false, string name = "")
Gated Recurrent Unit (GRU).
- Parameters
parent – Parent layer
units – Dimensionality of the output space
mask_zeros – Whether to mask (skip) timesteps whose input is all zeros, e.g. zero-padding in variable-length sequences
bidirectional – Whether the RNN is bidirectional or not
name – A name for the operation
- Returns
The GRU layer
-
layer eddl::GRU(vector<layer> parent, int units, bool mask_zeros, bool bidirectional, string name)
Overload of GRU that accepts a vector of parent layers.
Example:
l = GRU(l, 128);
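A hedged sketch of the bidirectional flag in use (input shape, output head, optimizer, and loss are illustrative assumptions, not from this page):

```cpp
// Sketch only: a bidirectional GRU regressor.
// Shapes and training configuration are illustrative assumptions.
#include <eddl/apis/eddl.h>
using namespace eddl;

int main() {
    layer in = Input({64});            // 64 features per timestep (assumed)
    // mask_zeros = false, bidirectional = true
    layer l = GRU(in, 128, false, true);
    layer out = Dense(l, 1);           // single regression output (assumed)

    model net = Model({in}, {out});
    build(net, rmsprop(0.01), {"mse"}, {"mse"});
    return 0;
}
```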
LSTM
-
layer eddl::LSTM(layer parent, int units, bool mask_zeros = false, bool bidirectional = false, string name = "")
Long Short-Term Memory layer - Hochreiter 1997.
- Parameters
parent – Parent layer
units – Dimensionality of the output space
mask_zeros – Whether to mask (skip) timesteps whose input is all zeros, e.g. zero-padding in variable-length sequences
bidirectional – Whether the RNN is bidirectional or not
name – A name for the operation
- Returns
The LSTM layer
-
layer eddl::LSTM(vector<layer> parent, int units, bool mask_zeros, bool bidirectional, string name)
Overload of LSTM that accepts a vector of parent layers.
Example:
l = LSTM(l, 32);
You can check a full example in Advanced models.
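A hedged sketch of mask_zeros with zero-padded sequences (the embedding size, class count, and training configuration are illustrative assumptions, not from this page):

```cpp
// Sketch only: an LSTM over zero-padded sequences.
// Shapes and training configuration are illustrative assumptions.
#include <eddl/apis/eddl.h>
using namespace eddl;

int main() {
    layer in = Input({32});            // e.g., 32-dim embedding per timestep (assumed)
    // mask_zeros = true skips all-zero (padded) timesteps;
    // bidirectional = false keeps a single forward pass.
    layer l = LSTM(in, 64, true, false);
    layer out = Softmax(Dense(l, 5));  // 5 classes (assumed)

    model net = Model({in}, {out});
    build(net, adam(0.001), {"softmax_cross_entropy"}, {"categorical_accuracy"});
    return 0;
}
```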