The parameters that configure machine learning algorithms are usually called hyperparameters (sometimes meta parameters), in contrast to the "normal" parameters that are learned during training.
But because they obviously also influence the quality of your predictions, these parameters must be tuned as well.
Examples are:
- the C parameter for SVM
- the subsampling ratio for XGBoost
- the number of layers and neurons for a neural net
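To make the distinction concrete, here is a minimal sketch using scikit-learn (not Nkululeko itself): the `C` value is a hyperparameter we choose before training, while the support-vector coefficients are the "normal" parameters the algorithm learns from the data.

```python
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# C is a hyperparameter: set by us before training starts
clf = SVC(C=1.0, kernel="linear")

# The dual coefficients are "normal" parameters: learned during fit()
clf.fit(X, y)
print(clf.dual_coef_.shape)
```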
The naive approach is simply to try them all;
how to do this with Nkululeko is described here.
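As a generic illustration of the "try them all" idea (a grid search, sketched here with scikit-learn rather than Nkululeko), every combination of the listed values is trained and cross-validated:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Exhaustively evaluate every combination in the grid
param_grid = {"C": [0.1, 1, 10, 100], "kernel": ["linear", "rbf"]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)
```

Note that the number of combinations grows multiplicatively with each added parameter, which is exactly why this approach breaks down quickly.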
But in general, because the search space for the optimal configuration is usually unbounded, it is better to try a stochastic approach or a genetic one.
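A simple stochastic alternative is random search: instead of enumerating a fixed grid, candidate values are drawn from a distribution. A sketch with scikit-learn (again, a generic illustration, not Nkululeko's own mechanism):

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Sample C from a log-uniform distribution instead of a fixed grid;
# n_iter bounds the budget regardless of how large the space is
param_dist = {"C": loguniform(1e-2, 1e3)}
search = RandomizedSearchCV(
    SVC(), param_dist, n_iter=20, cv=5, random_state=42
)
search.fit(X, y)
print(search.best_params_)
```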
Nkululeko exercise