Overview

Unlike standard feedforward neural networks, RNNs have 'memory': they process inputs one at a time and maintain an internal hidden state that summarizes what they have seen so far, so the output at each step depends on the current input and on the history before it.
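The idea above can be sketched as a single recurrent step. This is a minimal vanilla (Elman-style) RNN in NumPy; the weight names and sizes are illustrative assumptions, not any particular library's API:

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

# Illustrative weights: input-to-hidden, hidden-to-hidden, and a bias.
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x, h):
    # The new state depends on BOTH the current input x and the previous
    # state h -- this recurrence is the network's 'memory'.
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

h = np.zeros(hidden_size)                     # initial state
sequence = rng.normal(size=(5, input_size))   # a toy sequence of 5 inputs
for x in sequence:
    h = rnn_step(x, h)                        # state updated one input at a time
```

After the loop, `h` is a fixed-size summary of the whole sequence; a task-specific output layer (classifier, regressor) would typically read from it.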

Use Cases

  • Natural Language Processing (text is a sequence of words).
  • Time-series forecasting (e.g., stock prices).
  • Speech recognition.

Limitations

Standard RNNs struggle to learn long-term dependencies because gradients shrink exponentially as they are propagated back through many time steps (the Vanishing Gradient Problem). This limitation led to the development of LSTMs and GRUs, whose gating mechanisms let information and gradients survive over longer ranges.
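The vanishing gradient effect can be seen numerically in a toy setting. The sketch below (an assumed illustration, not a training procedure) repeatedly applies the backward step of a tanh recurrence: each step multiplies the gradient by the transposed Jacobian `diag(1 - h^2) @ W_hh`, so with modest recurrent weights the gradient norm collapses toward zero as it travels back in time:

```python
import numpy as np

rng = np.random.default_rng(1)
hidden_size = 8

# Small random recurrent weights -- a common initialization regime.
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))

h = rng.normal(size=hidden_size)
grad = np.ones(hidden_size)   # gradient arriving from the loss
norms = []
for t in range(50):
    h = np.tanh(W_hh @ h)                      # one forward step
    grad = (1 - h**2) * (W_hh.T @ grad)        # one backward step through tanh
    norms.append(np.linalg.norm(grad))

# norms[0] >> norms[-1]: inputs far in the past contribute almost nothing
# to the gradient, so a plain RNN can barely learn from them.
```

This is exactly the failure mode that LSTM and GRU gating is designed to mitigate: their additive state updates give gradients a path that is not repeatedly squashed.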

Related Terms