Hyperparameter — A setting that controls the training process, such as learning rate, batch size, number of layers, or dropout rate. Unlike model parameters (weights), which are learned during training, hyperparameters are set by the developer before training begins. The search for the best-performing hyperparameters is called hyperparameter tuning.
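Tuning can be as simple as a grid search: evaluate every combination of candidate values and keep the one with the lowest validation loss. The sketch below illustrates this with a hypothetical `validation_loss` function standing in for a real train-and-evaluate run; the grid values and the loss surface are illustrative assumptions, not a recommendation.

```python
import itertools

# Hypothetical stand-in for a real train-and-evaluate run:
# takes hyperparameters, returns a validation loss.
def validation_loss(learning_rate, batch_size):
    # Toy loss surface with its minimum at lr=0.01, batch_size=32.
    return (learning_rate - 0.01) ** 2 + ((batch_size - 32) / 100) ** 2

# Candidate values for each hyperparameter (illustrative choices).
grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
}

# Grid search: try every combination, keep the best-scoring one.
best_loss, best_config = float("inf"), None
for lr, bs in itertools.product(grid["learning_rate"], grid["batch_size"]):
    loss = validation_loss(lr, bs)
    if loss < best_loss:
        best_loss = loss
        best_config = {"learning_rate": lr, "batch_size": bs}

print(best_config)
```

In practice the grid grows exponentially with the number of hyperparameters, which is why random search and Bayesian optimization are common alternatives.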
Part of the XLUXX AI Encyclopedia.
