Overview

Backpropagation (backward propagation of errors) is the algorithm by which neural networks 'learn.' Using the chain rule, it computes the gradient of the loss with respect to each weight — that is, how much each weight contributed to the error in the output.
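The chain-rule computation can be sketched for a single weight. This is a minimal illustrative example (not from the article): a one-neuron model y = w * x with squared-error loss L = (y - target)**2, whose gradient is dL/dw = dL/dy * dy/dw.

```python
def gradient(w, x, target):
    y = w * x                  # forward pass: the model's prediction
    dL_dy = 2 * (y - target)   # how the loss changes with the prediction
    dy_dw = x                  # how the prediction changes with the weight
    return dL_dy * dy_dw       # chain rule: the weight's contribution to the error

print(gradient(0.5, 2.0, 3.0))  # y = 1.0, so the gradient is 2 * (1 - 3) * 2 = -8.0
```

The negative gradient here says the loss shrinks if the weight is increased, which is exactly the signal an optimizer uses.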

The Process

  1. Forward Pass: Data flows through the network to produce a prediction.
  2. Loss Calculation: A loss function measures how far the prediction is from the target.
  3. Backward Pass: The error is propagated back through the layers, calculating gradients.
  4. Weight Update: An optimizer (like SGD) uses the gradients to adjust the weights.
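The four steps above can be sketched end to end. This is a hedged, illustrative example (the two-layer scalar network, the learning rate, and all names are assumptions, not from the article): y = w2 * h with h = w1 * x, trained by plain SGD on a squared-error loss.

```python
def train_step(w1, w2, x, target, lr=0.1):
    # 1. Forward pass: data flows through both layers to produce a prediction
    h = w1 * x
    y = w2 * h
    # 2. Loss calculation: squared error between prediction and target
    loss = (y - target) ** 2
    # 3. Backward pass: propagate the error back through each layer
    dL_dy = 2 * (y - target)
    dL_dw2 = dL_dy * h          # gradient for the output-layer weight
    dL_dh = dL_dy * w2          # error signal passed back to the hidden layer
    dL_dw1 = dL_dh * x          # gradient for the hidden-layer weight
    # 4. Weight update: SGD steps each weight against its gradient
    w1 -= lr * dL_dw1
    w2 -= lr * dL_dw2
    return w1, w2, loss

w1, w2 = 0.5, 0.5
for _ in range(50):
    w1, w2, loss = train_step(w1, w2, x=1.0, target=2.0)
print(loss)  # the loss shrinks toward zero as the weights adjust
```

Note how step 3 reuses dL_dy when computing dL_dh: each layer's gradient is built from the error signal of the layer after it, which is what "propagated back through the layers" means.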

Importance

Without backpropagation, training deep neural networks would be computationally impractical: estimating gradients by finite differences requires a separate forward pass for every parameter, whereas backpropagation computes all gradients in a single backward pass at roughly the cost of one forward pass.
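This cost difference can be made concrete with a gradient check. The sketch below (illustrative; the single-weight model and all names are assumptions) compares the analytic chain-rule gradient with a finite-difference estimate, which needs two extra forward passes for just this one weight — a cost that does not scale to networks with millions of parameters.

```python
def loss(w, x, target):
    return (w * x - target) ** 2   # forward pass plus squared-error loss

def backprop_grad(w, x, target):
    return 2 * (w * x - target) * x   # analytic gradient via the chain rule

def finite_diff_grad(w, x, target, eps=1e-6):
    # two forward passes per weight: nudge the weight up and down
    return (loss(w + eps, x, target) - loss(w - eps, x, target)) / (2 * eps)

g_analytic = backprop_grad(0.5, 2.0, 3.0)
g_numeric = finite_diff_grad(0.5, 2.0, 3.0)
print(abs(g_analytic - g_numeric) < 1e-5)  # the two estimates agree
```

The two gradients match, but only backpropagation gets them all in one pass.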

Related Terms