Overview

Training a model usually requires many epochs. An epoch is one complete pass through the training set: the model sees every example once and updates its weights accordingly.
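The epoch loop above can be sketched in plain Python. This is a minimal, illustrative example (gradient descent on a one-weight toy linear fit, not any particular library's API); the data, learning rate, and epoch count are all made up for demonstration.

```python
# Toy training set where the true relationship is y = 2x (illustrative data).
data = [(x, 2.0 * x) for x in range(1, 6)]

w = 0.0       # single weight to learn
lr = 0.01     # learning rate (assumed value)
epochs = 100  # number of full passes over the training set

for epoch in range(epochs):
    for x, y in data:               # each epoch visits every example once
        pred = w * x
        grad = 2 * (pred - y) * x   # derivative of squared error w.r.t. w
        w -= lr * grad              # weight update

print(round(w, 3))  # w approaches the true value 2.0
```

After 100 epochs the weight has converged close to the true value; with far fewer epochs it would still be far from 2.0, which is the underfitting scenario described below.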

Key Concepts

  • Underfitting: Too few epochs; the model hasn't yet learned the underlying patterns and performs poorly even on the training data.
  • Overfitting: Too many epochs; the model starts memorizing the training data, so its performance on unseen data degrades.
  • Early Stopping: A technique that stops training when performance on a held-out validation set stops improving, even if more epochs were planned.
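Early stopping is often implemented with a "patience" counter: training halts after validation loss fails to improve for a set number of consecutive epochs. A minimal sketch, using a hand-written list of validation losses to simulate a run that improves and then overfits (all numbers are illustrative):

```python
# Simulated per-epoch validation losses: improving, then worsening
# as the model begins to overfit. Values are made up for illustration.
val_losses = [0.9, 0.7, 0.55, 0.5, 0.48, 0.49, 0.51, 0.54, 0.6, 0.7]

patience = 2             # epochs to tolerate without improvement (assumed)
best_loss = float("inf")
bad_epochs = 0
stopped_at = None

for epoch, loss in enumerate(val_losses):
    if loss < best_loss:     # validation improved: record it, reset counter
        best_loss = loss
        bad_epochs = 0
    else:                    # no improvement this epoch
        bad_epochs += 1
        if bad_epochs >= patience:
            stopped_at = epoch   # halt before the planned epochs finish
            break

print(stopped_at, best_loss)  # stops at epoch 6 with best loss 0.48
```

In practice the weights saved at the best-loss epoch are restored, so the deployed model is the one from before overfitting set in.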

Related Terms