Since version 0.15.0, Nkululeko supports dropout within MLP (multi-layer perceptron) models. Here's an example:
[MODEL]
type = mlp
layers = {'l1':128, 'l2':16}
drop = .5
learning_rate = 0.0001
This means that after each hidden layer, dropout with a probability of 50 percent is applied.
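
To illustrate what this configuration describes, here is a minimal PyTorch sketch of such a network. This is an assumption about the resulting architecture, not the actual Nkululeko implementation; the input dimension of 64, the four output classes, and the ReLU activations are illustrative choices.

import torch.nn as nn

# Hypothetical network corresponding to layers = {'l1':128, 'l2':16}
# with drop = .5: dropout follows each hidden layer.
model = nn.Sequential(
    nn.Linear(64, 128),   # hidden layer 'l1' (input dim 64 is assumed)
    nn.ReLU(),
    nn.Dropout(p=0.5),    # drop = .5
    nn.Linear(128, 16),   # hidden layer 'l2'
    nn.ReLU(),
    nn.Dropout(p=0.5),    # drop = .5
    nn.Linear(16, 4),     # output layer (4 classes is assumed)
)

During training, each dropout layer randomly zeroes 50 percent of its inputs, which acts as a regularizer; at inference time (model.eval()), dropout is disabled automatically.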