Initializers

GlorotNormal

layer eddl::GlorotNormal(layer l, int seed = 1234)

Glorot normal initializer, also called Xavier normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)), where fan_in is the number of input units in the weight tensor and fan_out is the number of output units.

Parameters
  • l – Parent layer to initialize

  • seed – Used to seed the random generator

Returns

The layer l initialized with the Glorot normal initializer

Example:

l = GlorotNormal(l);
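
In a full model definition the initializer wraps a layer as it is built. A minimal sketch, assuming a Dense layer with 784 inputs and 128 outputs (the sizes and the Input shape are illustrative):

#include <eddl/apis/eddl.h>
using namespace eddl;

layer in = Input({784});
// For this weight tensor, stddev = sqrt(2 / (784 + 128)) ≈ 0.047
layer l = GlorotNormal(Dense(in, 128), 1234);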

GlorotUniform

layer eddl::GlorotUniform(layer l, int seed = 1234)

Glorot uniform initializer, also called Xavier uniform initializer.

It draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out)), fan_in is the number of input units in the weight tensor, and fan_out is the number of output units.

Parameters
  • l – Parent layer to initialize

  • seed – Used to seed the random generator

Returns

The layer l initialized with the Glorot uniform initializer

Example:

l = GlorotUniform(l);
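
As a concrete check of the formula: a weight tensor with fan_in = 784 and fan_out = 128 (illustrative sizes) gives limit = sqrt(6 / 912) ≈ 0.081, so every weight is drawn from [-0.081, 0.081].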

RandomNormal

layer eddl::RandomNormal(layer l, float m = 0.0, float s = 0.1, float seed = 1234)

Random normal initializer.

Parameters
  • l – Parent layer to initialize

  • m – Mean of the normal distribution to draw samples

  • s – Standard deviation of the normal distribution to draw samples

  • seed – Used to seed the random generator

Returns

The layer l initialized with a random normal distribution

Example:

l = RandomNormal(l, 0.0, 0.1);
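
To override the defaults, e.g. drawing from a narrower distribution with a fixed seed for reproducibility (the values are illustrative):

l = RandomNormal(l, 0.0, 0.05, 42);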

RandomUniform

layer eddl::RandomUniform(layer l, float min = 0.0, float max = 0.1, float seed = 1234)

Random uniform initializer.

Parameters
  • l – Parent layer to initialize

  • min – Lower bound of the uniform distribution

  • max – Upper bound of the uniform distribution

  • seed – Used to seed the random generator

Returns

The layer l initialized with a random uniform distribution

Example:

l = RandomUniform(l, -0.05, 0.05);
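
The same call with an explicit seed so that runs are reproducible (values are illustrative):

l = RandomUniform(l, -0.05, 0.05, 42);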

Constant

layer eddl::Constant(layer l, float v = 0.1)

Initializer that generates tensors initialized to a constant value.

Parameters
  • l – Parent layer to initialize

  • v – Constant value used to initialize the weights

Returns

The layer l initialized with a constant value

Example:

l = Constant(l, 0.5);
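
Zero initialization is simply the v = 0.0 special case (whether that is a sensible start depends on the layer):

l = Constant(l, 0.0);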

HeUniform

layer eddl::HeUniform(layer l, int seed = 1234)

He uniform initializer.

It draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / fan_in) and fan_in is the number of input units in the weight tensor.

Parameters
  • l – Parent layer to initialize

  • seed – Used to seed the random generator

Returns

The layer l initialized with the He uniform initializer

Example:

l = HeUniform(l);
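
As a concrete check of the formula: fan_in = 784 (an illustrative size) gives limit = sqrt(6 / 784) ≈ 0.087. Note that, unlike the Glorot initializers, only the input side of the weight tensor enters the formula.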

HeNormal

layer eddl::HeNormal(layer l, int seed = 1234)

He normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / fan_in), where fan_in is the number of input units in the weight tensor.

Parameters
  • l – Parent layer to initialize

  • seed – Used to seed the random generator

Returns

The layer l initialized with the He normal initializer

Example:

l = HeNormal(l);
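
As a concrete check of the formula: fan_in = 784 (an illustrative size) gives stddev = sqrt(2 / 784) ≈ 0.051. He initializers were derived for rectified activations, so a typical pairing is a sketch like:

l = ReLu(HeNormal(Dense(l, 128)));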