Overview

The F1 Score is a useful metric when you need to balance Precision and Recall, especially on an imbalanced dataset (e.g., 99% of cases are negative), where accuracy alone can be misleading.

Why Harmonic Mean?

Unlike a simple average, the harmonic mean penalizes extreme values. If either Precision or Recall is very low, the F1 Score will also be low.
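A quick numeric sketch of this penalty (the Precision and Recall values below are illustrative, not taken from the text): with a very low Recall, the arithmetic mean still looks respectable, while the harmonic mean collapses toward zero.

```python
precision, recall = 1.0, 0.01  # illustrative extreme values

arithmetic_mean = (precision + recall) / 2
harmonic_mean = 2 * precision * recall / (precision + recall)

print(arithmetic_mean)            # 0.505 — misleadingly high
print(round(harmonic_mean, 4))    # 0.0198 — dominated by the low recall
```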

Formula

F1 = 2 * (Precision * Recall) / (Precision + Recall)
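As a minimal sketch, the formula can be computed directly from confusion-matrix counts; the function name and the counts below are illustrative assumptions, not from the text.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """Compute the F1 Score from true positives, false positives, and false negatives."""
    if tp == 0:
        return 0.0  # no true positives: precision and recall are both zero
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts: precision = 0.8, recall ≈ 0.667
print(f1_score(tp=80, fp=20, fn=40))
```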

Use Case

The F1 Score is a standard metric for evaluating classification models where both false positives and false negatives carry a cost.

Related Terms

Precision, Recall