Overview

Activation functions allow neural networks to learn complex, non-linear relationships. Without them, stacking layers accomplishes nothing: a composition of linear maps is itself a single linear map, so the whole network collapses to one linear model regardless of depth.
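The collapse-to-linear point above can be checked numerically. This is a minimal NumPy sketch (the weight shapes and random seed are arbitrary choices for illustration): two stacked linear layers with no activation produce exactly the same outputs as one merged linear layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation between them: y = W2 @ (W1 @ x)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=(3,))

deep = W2 @ (W1 @ x)       # output of the two stacked linear layers
collapsed = (W2 @ W1) @ x  # one equivalent linear layer

# Identical: without a non-linearity, depth adds no expressive power
assert np.allclose(deep, collapsed)
```

Inserting any non-linear function between `W1` and `W2` breaks this equivalence, which is exactly what lets extra layers add modeling power.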

Common Functions

  • ReLU (max(0, x)): the most popular choice for hidden layers; cheap to compute and passes positive values through unchanged.
  • Sigmoid (1 / (1 + e^(-x))): squashes values into (0, 1); used for binary classification outputs.
  • Softmax: normalizes a vector of scores into a probability distribution that sums to 1; used for multi-class classification outputs.
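The three functions listed above are each a few lines of NumPy. This sketch is one straightforward way to write them (the max-shift in softmax is a standard trick for numerical stability, not part of the definition):

```python
import numpy as np

def relu(x):
    # Passes positive values through, zeroes out negatives
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes any real value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Turns a vector of scores into probabilities that sum to 1
    z = np.exp(x - np.max(x))  # subtract the max to avoid overflow
    return z / z.sum()

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))       # [0. 0. 3.]
print(sigmoid(0.0))  # 0.5
print(softmax(x))    # three probabilities summing to 1
```

Note that sigmoid and softmax coincide in the two-class case: softmax over the scores [0, z] gives the same result as sigmoid(z).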

Role

By transforming each neuron's output non-linearly, they control how strongly a signal is passed to the next layer. ReLU, for example, passes positive values through unchanged and blocks negative ones entirely.
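That gating behavior is easy to see in a single forward pass. In this sketch the weights, bias, and input are made-up values chosen so that one neuron's pre-activation is positive and the other's is negative:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical weights and bias for a two-neuron layer
W = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
b = np.array([-0.5, -0.5])
x = np.array([1.0, 0.0])

pre = W @ x + b   # raw layer outputs: [0.5, -1.5]
post = relu(pre)  # only the positive signal survives: [0.5, 0.0]
```

The second neuron's negative output is zeroed, so it contributes nothing downstream; the first neuron's value is passed on unchanged.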

Related Terms