Overview

In a standard Markov chain, the state is directly observable. In a hidden Markov model (HMM), you see only the observations that the states emit; the underlying state sequence itself remains hidden.

Example

If you are stuck in a windowless room, you might infer the weather (the hidden state) from the clothing people wear when they walk in (the observation).
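The generative process behind this example can be sketched as a short simulation: the hidden weather state evolves by its transition probabilities, and each day emits one visible clothing observation. All state names, observation symbols, and probabilities below are illustrative assumptions, not values from the text.

```python
import random

# Toy HMM: hidden weather states, observable clothing (all numbers assumed)
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"umbrella": 0.6, "coat": 0.3, "t-shirt": 0.1},
          "Sunny": {"umbrella": 0.1, "coat": 0.3, "t-shirt": 0.6}}

def sample(length):
    """Walk the hidden chain, emitting one observation per step."""
    state = random.choices(states, weights=[start_p[s] for s in states])[0]
    hidden, observed = [], []
    for _ in range(length):
        hidden.append(state)
        clothing = random.choices(list(emit_p[state]),
                                  weights=list(emit_p[state].values()))[0]
        observed.append(clothing)
        state = random.choices(states,
                               weights=[trans_p[state][s] for s in states])[0]
    return hidden, observed

hidden, observed = sample(5)
print("observed:", observed)  # what you see from the windowless room
print("hidden:  ", hidden)    # what you are trying to infer
```

The point of the sketch is the asymmetry: only `observed` is available to an inference algorithm; `hidden` is the ground truth it tries to recover.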

Key Algorithms

  • Forward-Backward Algorithm: computes the probability of an observation sequence (the forward pass alone suffices for this) and the posterior probability of each hidden state at each step.
  • Viterbi Algorithm: finds the single most likely sequence of hidden states given the observations.
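Both algorithms are short dynamic programs over (time step, state) pairs. A minimal sketch on a toy weather/clothing model follows; the model parameters are illustrative assumptions, not values from the text.

```python
# Toy HMM (all probabilities assumed for illustration)
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"umbrella": 0.6, "coat": 0.3, "t-shirt": 0.1},
          "Sunny": {"umbrella": 0.1, "coat": 0.3, "t-shirt": 0.6}}

def forward(obs, states, start_p, trans_p, emit_p):
    """Forward pass: total probability of the observation sequence."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans_p[p][s] for p in states) * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for the observations."""
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max((V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                             for p in states)
            V[t][s] = prob
            back[t][s] = prev
    # Backtrack from the most probable final state
    path = [max(V[-1], key=V[-1].get)]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

obs = ("umbrella", "umbrella", "t-shirt")
print(forward(obs, states, start_p, trans_p, emit_p))  # P(observations)
print(viterbi(obs, states, start_p, trans_p, emit_p))  # best hidden path
```

In practice both recursions are run in log space (or with per-step scaling) to avoid underflow on long sequences; the plain products above are kept only for readability.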

Applications

  • Speech recognition (the dominant approach before deep learning).
  • Bioinformatics (DNA sequence analysis).
  • Part-of-speech tagging in NLP.

Related Terms