Since version 0.15.0, dropout within MLP (multilayer perceptron) models is supported by Nkululeko. Here's an example:
[MODEL]
type = mlp
layers = [128, 16]
drop = .5
meaning that after each hidden layer, dropout with a probability of 50 percent is applied.
Since version 0.97.3, you can also assign an individual dropout rate to each layer:
[MODEL]
type = mlp
layers = [128, 16]
drop = [.5, .2]
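To make the effect of these settings concrete, here is a minimal sketch (not Nkululeko's actual implementation) of how such a configuration could translate into a PyTorch model: one `nn.Dropout` after each hidden layer, with the rate taken from the `drop` list. The function name `build_mlp` and the feature/class counts are hypothetical.

```python
import torch.nn as nn

def build_mlp(n_features, n_classes, layers, drop):
    """Sketch: stack Linear + ReLU + Dropout per hidden layer,
    using layers = [128, 16] and drop = [.5, .2] from the [MODEL] section."""
    modules = []
    in_size = n_features
    for size, p in zip(layers, drop):
        modules.append(nn.Linear(in_size, size))
        modules.append(nn.ReLU())
        modules.append(nn.Dropout(p))  # per-layer dropout rate
        in_size = size
    modules.append(nn.Linear(in_size, n_classes))  # output layer, no dropout
    return nn.Sequential(*modules)

# hypothetical sizes, e.g. 988 acoustic features and 4 emotion classes
model = build_mlp(n_features=988, n_classes=4, layers=[128, 16], drop=[.5, .2])
```

With a single scalar `drop = .5`, the same rate would simply be reused for every hidden layer. Note that dropout is only active during training; calling `model.eval()` disables it for inference.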