Adding dropout to MLP models with Nkululeko

Since version 0.15.0, Nkululeko supports dropout in MLP (multi-layer perceptron) models. Here's an example configuration:

type = mlp
layers = {'l1':128, 'l2':16}
drop = .5
learning_rate = 0.0001

meaning that after each hidden layer a dropout probability of 50 percent is applied.
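To make the effect of these settings concrete, here is a hedged PyTorch sketch of how such a configuration could translate into a network: a linear layer, an activation, and a dropout layer per hidden layer. The function name `make_mlp` and the input/output sizes are illustrative assumptions, not Nkululeko's actual internals.

```python
import torch
import torch.nn as nn

def make_mlp(n_features, n_classes, layers, drop):
    # Illustrative sketch, not Nkululeko's implementation:
    # layers = {'l1': 128, 'l2': 16}, drop = .5 as in the config above.
    sizes = [n_features] + list(layers.values())
    modules = []
    for i in range(len(sizes) - 1):
        modules.append(nn.Linear(sizes[i], sizes[i + 1]))
        modules.append(nn.ReLU())
        modules.append(nn.Dropout(p=drop))  # dropout after each hidden layer
    modules.append(nn.Linear(sizes[-1], n_classes))
    return nn.Sequential(*modules)

# Assumed sizes: 64 input features, 4 target classes.
model = make_mlp(64, 4, {'l1': 128, 'l2': 16}, 0.5)
out = model(torch.randn(8, 64))
print(out.shape)  # torch.Size([8, 4])
```

During training, each dropout layer randomly zeroes activations with probability 0.5 and rescales the rest; calling `model.eval()` disables dropout for inference.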
