Epoch — One complete pass through the entire training dataset. Training a model typically takes many epochs, often 3 to 100 or more depending on the task. Too few epochs and the model has not learned enough (underfitting); too many and it starts memorizing the training data rather than generalizing (overfitting). Monitoring the loss per epoch helps decide when to stop.
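A minimal sketch of the stopping decision described above: early stopping halts training once the validation loss has failed to improve for a set number of epochs (the "patience"). The function name, the patience value, and the loss values below are illustrative assumptions, not part of any particular library; in practice each loss would come from one full pass over the validation data.

```python
# Hypothetical sketch of epoch-based early stopping.

def train_with_early_stopping(val_losses, patience=3):
    """Return the 0-based epoch at which training should stop.

    Stops once the validation loss has failed to improve for
    `patience` consecutive epochs, a common guard against overfitting.
    """
    best_loss = float("inf")
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            return epoch  # no improvement for `patience` epochs in a row
    return len(val_losses) - 1  # never triggered; trained all epochs

# Illustrative losses: improvement through epoch 4, then a plateau.
losses = [0.90, 0.70, 0.55, 0.50, 0.48, 0.49, 0.50, 0.52]
print(train_with_early_stopping(losses, patience=3))  # → 7
```

Here training would stop at epoch 7, since epochs 5, 6, and 7 all failed to beat the best loss (0.48) reached at epoch 4.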
Part of the XLUXX AI Encyclopedia.
