Overview

Released by Google in 2018, BERT (Bidirectional Encoder Representations from Transformers) changed NLP by conditioning on both left and right context at once: every token in its Transformer encoder attends to the whole input, so a word's representation reflects its full surroundings rather than only the words that precede it.

Pre-training Tasks

  1. Masked Language Modeling (MLM): roughly 15% of input tokens are hidden (mostly replaced with a [MASK] token), and the model learns to predict the original words from the surrounding context.
  2. Next Sentence Prediction (NSP): given a pair of sentences, the model classifies whether the second sentence actually followed the first in the corpus or was sampled at random.
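The MLM objective above can be sketched as a small token-masking routine. The 15% selection rate and the 80/10/10 split (mask / random token / keep) follow the original BERT recipe; the toy vocabulary and the function name here are illustrative, not part of any real library.

```python
import random

MASK = "[MASK]"
VOCAB = ["cat", "dog", "runs", "sleeps", "the"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """BERT-style MLM masking: select ~mask_prob of tokens as prediction
    targets; of those, 80% become [MASK], 10% a random token, 10% unchanged."""
    rng = rng or random.Random(0)
    inputs, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok                    # the model must recover this token
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK               # usual case: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.choice(VOCAB)  # sometimes: a random token
            # otherwise: keep the original token (but still predict it)
    return inputs, labels
```

Keeping some selected tokens unchanged (and randomizing others) prevents the model from relying on [MASK] ever appearing at inference time, when inputs contain no mask tokens at all.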

Impact

BERT set new state-of-the-art results on benchmarks such as GLUE and SQuAD, and fine-tuned variants are widely used for search ranking, sentiment analysis, and question answering.

Related Terms