Model

Constructor

model eddl::Model(vlayer in, vlayer out)

Instantiates a model, taking two vectors: one of input layers and another of output layers.

Parameters
  • in – Vector with model input layers, typically Input({…})

  • out – Vector with the output layers of the model. Example: {Sigmoid(MyModel())}

Returns

Model instance

Example:

layer in1 = Input({3,32,32});
layer in2 = Input({1,32,32});
layer l = Concat(in1, in2);
...
layer out = Activation(Dense(l, num_classes), "softmax");
...
model net = Model({in1, in2}, {out});

Build

void eddl::build(model net, optimizer o, const vector<string> &lo, const vector<string> &me, CompServ *cs = nullptr, bool init_weights = true)

Configures the model for training.

Parameters
  • net – Model

  • o – Optimizer

  • lo – Vector with losses

  • me – Vector with metrics

  • cs – Computing service

  • init_weights – If true, the weights are initialized to random values; set to false to keep them uninitialized (e.g. when the weights of a pretrained model are going to be loaded)

Returns

(void)

Example:

...
model net = Model({in}, {out});

// Build model
build(net,
      sgd(0.01, 0.9), // Optimizer
      {"soft_cross_entropy"}, // Losses
      {"categorical_accuracy"}, // Metrics
      CS_GPU({1}, "low_mem") // GPU with only one gpu
);
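
When the weights of a pretrained model are going to be loaded afterwards, the random initialization can be skipped through init_weights. A minimal sketch, assuming a weights file "pretrained.bin" saved earlier with save():

...
model net = Model({in}, {out});

// Build model without initializing the weights
build(net,
      adam(0.001), // Optimizer
      {"soft_cross_entropy"}, // Losses
      {"categorical_accuracy"}, // Metrics
      CS_CPU(), // Computing service
      false // init_weights: keep weights uninitialized, they will be loaded
);

// Load the previously saved weights (hypothetical file name)
load(net, "pretrained.bin");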

Summary

void eddl::summary(model m)

Prints a summary representation of your model.

Parameters

m – Model to print

Returns

(void) Prints the model

Example:

...
model net = Model({in}, {out});

// Build model
build(net,
      sgd(0.01, 0.9), // Optimizer
      {"soft_cross_entropy"}, // Losses
      {"categorical_accuracy"}, // Metrics
      CS_GPU({1}, "low_mem") // GPU with only one gpu
);

summary(net);

Result:

Generating Random Table
---------------------------------------------
input1                        |  (784)     =>      (784)
dense1                        |  (784)     =>      (1024)
leaky_relu1                   |  (1024)    =>      (1024)
dense2                        |  (1024)    =>      (1024)
leaky_relu2                   |  (1024)    =>      (1024)
dense3                        |  (1024)    =>      (1024)
leaky_relu3                   |  (1024)    =>      (1024)
dense4                        |  (1024)    =>      (10)
softmax4                      |  (10)      =>      (10)
---------------------------------------------

Plot

void eddl::plot(model m, const string &fname = "model.pdf", const string &rankdir = "LR")

Plots a representation of the model. This method simply calls the "dot" program (Graphviz) with the given parameters. For more details, see: https://graphviz.org/documentation/.

Parameters
  • m – Model to plot

  • fname – Output file name (default: "model.pdf")

  • rankdir – Graph orientation passed to Graphviz: "LR" (left-right, default) or "TB" (top-bottom)

Returns

(void) Plots the model

Example:

...
model net = Model({in}, {out});

plot(net,"model.pdf");

Result:

[Image: mlp.svg, the generated plot of the model]
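
The rankdir argument follows Graphviz conventions; for instance, to get a top-to-bottom layout instead of the default left-to-right one:

plot(net, "model.pdf", "TB"); // top-to-bottom orientation ("LR" is the default)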

Load weights

Loads the weights of a model (not the topology).

void eddl::load(model m, const string &fname, const string &format = "bin")

Loads weights to reinstantiate your model.

Parameters
  • m – Model

  • fname – File where the model weights are saved

  • format – Format of the weights file (default: "bin")

Returns

(void) Loads the weights

Example:

load(net, "model-weights.bin");

Save weights

Saves the weights of a model (not the topology).

void eddl::save(model m, const string &fname, const string &format = "bin")

Saves the weights of a model.

Parameters
  • m – Model

  • fname – File where the model weights will be saved

  • format – Format of the weights file (default: "bin")

Returns

(void) Saves the weights

Example:

save(net, "model-weights.bin");

Learning rate (on the fly)

void eddl::setlr(model net, vector<float> p)

Changes the learning rate and hyperparameters of the model optimizer.

Parameters
  • net – Model

  • p – Vector with the learning rate and the remaining hyperparameters of the model optimizer (e.g. {lr, momentum} for SGD)

Returns

(void) Changes model optimizer settings

Example:

...
model net = Model({in}, {out});

// Build model
...

setlr(net, {0.005, 0.9});

// Train model
fit(net, {x_train}, {y_train}, batch_size, epochs);

Logging

void eddl::setlogfile(model net, const string &fname)

Saves the training outputs of a model to a file.

Parameters
  • net – Model to train

  • fname – Name of the logfile

Returns

(void) Outputs log to the given file

Example:

model net = Model({in}, {out});

// Build model
...

setlogfile(net,"model-log");

// Train model
fit(net, {x_train}, {y_train}, batch_size, epochs);

Move to device

Moves the model to a specific device.

void eddl::toCPU(model net, int th = -1)

Assigns model operations to the CPU.

Parameters
  • net – Model

  • th – Number of CPU threads (if -1, use all available threads)

Returns

(void)

Example:

toCPU(net);
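
The number of threads can also be set explicitly, e.g.:

toCPU(net, 4); // Use 4 CPU threads instead of all available ones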

void eddl::toGPU(model net, vector<int> g = {1}, int lsb = 1, const string &mem = "full_mem")

Assigns model operations to the GPU.

Parameters
  • net – Model

  • g – Vector indicating which GPUs will be used (1=on, 0=off)

  • lsb – Number of batches between weight synchronizations when multiple GPUs are used (default: 1)

  • mem – Indicates the memory consumption setup of the model. One of "full_mem" (default), "mid_mem" or "low_mem".

Returns

(void)

Example:

toGPU(net, {1}); // Use GPU #1 (implicit: synchronize every batch and use 'full_mem' setup)
toGPU(net, {1, 1}, 100, "low_mem"); // Use GPUs #1 and #2, synchronize every 100 batches and use 'low_mem' setup

Get parameters

vector<vtensor> eddl::get_parameters(model net, bool deepcopy = false)

Returns a vector of vectors of tensors with the parameters of each layer. These tensors are stored on the CPU. This function is transparent for distributed mode.

Parameters
  • net – Model

  • deepcopy – Whether the returned vectors contain references to the tensors (false) or copies of them (true)

Returns

Vector of vectors of tensors

Example:

vector<vtensor> myparameters = get_parameters(net, true); // deepcopy=true
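
The returned structure can be traversed layer by layer; a minimal sketch (assuming net has already been built):

vector<vtensor> params = get_parameters(net); // references to the tensors, on CPU
for (auto &layer_params : params)             // one vector of tensors per layer
    for (auto &t : layer_params)              // one tensor per parameter (e.g. weights, bias)
        t->info();                            // print tensor details such as its shape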

Set parameters

void eddl::set_parameters(model net, const vector<vtensor> &params)

Sets the parameters of the net. This function is transparent for distributed mode.

Parameters
  • net – Model

  • params – Parameters to copy to the net

Returns

(void)

Example:

vector<vtensor> myparameters = ...
set_parameters(net, myparameters);
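
Together with get_parameters(), this can be used to copy the weights from one model to another with the same topology; a minimal sketch (netA and netB are hypothetical models, both already built):

vector<vtensor> params = get_parameters(netA, true); // deep copy, tensors on CPU
set_parameters(netB, params);                        // netB now has netA's weights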