Overview

Early stopping is one of the most effective and least expensive ways to prevent overfitting. It halts training before the model has trained for so long that it starts memorizing noise in the training data instead of learning patterns that generalize.

How it Works

  1. Monitor the loss (or accuracy) on a separate Validation Set during training.
  2. If the validation loss stops decreasing (or starts increasing) for a certain number of epochs (called 'patience'), stop the training.
  3. Save the weights from the epoch that had the best validation performance.
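The three steps above can be sketched as a small helper class. This is a hypothetical, framework-free illustration (the class name, `patience`, and `min_delta` are assumptions, not a specific library's API): it tracks the best validation loss, counts epochs without improvement, and keeps the weights from the best epoch.

```python
class EarlyStopping:
    """Hypothetical sketch of early stopping on a validation metric."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience        # step 2: epochs to wait after the last improvement
        self.min_delta = min_delta      # smallest decrease that counts as an improvement
        self.best_loss = float("inf")
        self.best_weights = None
        self.epochs_without_improvement = 0

    def step(self, val_loss, weights):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            # Step 1 and 3: validation loss improved, so remember these weights.
            self.best_loss = val_loss
            self.best_weights = weights
            self.epochs_without_improvement = 0
        else:
            self.epochs_without_improvement += 1
        # Step 2: stop once the loss has failed to improve for `patience` epochs.
        return self.epochs_without_improvement >= self.patience


# Toy run: the validation loss improves, then plateaus, so training stops
# and the best-performing epoch's weights are retained.
losses = [0.90, 0.70, 0.55, 0.50, 0.52, 0.53, 0.51]
stopper = EarlyStopping(patience=3)
for epoch, loss in enumerate(losses):
    weights = {"epoch": epoch}  # stand-in for real model weights
    if stopper.step(loss, weights):
        print(f"Stopped at epoch {epoch}; best epoch was {stopper.best_weights['epoch']}")
        break
```

In this toy run the loss bottoms out at epoch 3 (0.50), the counter reaches the patience of 3 at epoch 6, and the weights saved from epoch 3 are the ones you would restore.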

Related Terms