Overview
Released by Google in 2018, BERT (Bidirectional Encoder Representations from Transformers) changed NLP by conditioning on context from both directions (left-to-right and right-to-left) simultaneously. This allows it to interpret each word based on its full surrounding context, rather than only the words that precede it.
Pre-training Tasks
- Masked Language Modeling (MLM): Randomly masking a fraction of the input tokens (about 15%) and training the model to predict them from the surrounding context.
- Next Sentence Prediction (NSP): A binary classification task that decides whether the second of two sentences actually followed the first in the original text, or was sampled at random.
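The MLM masking scheme above can be sketched in plain Python. This is a minimal illustration, not the original implementation: the function name, the toy vocabulary, and the use of whitespace tokens instead of real WordPiece subwords are all assumptions for the example. It does follow BERT's published recipe for selected positions: 80% become [MASK], 10% become a random token, 10% are left unchanged.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style MLM masking (illustrative sketch).

    Selects roughly `mask_prob` of the positions as prediction targets;
    of those, 80% are replaced with [MASK], 10% with a random vocab
    token, and 10% are kept unchanged. Returns the corrupted sequence
    and a parallel list of labels (None at non-target positions).
    """
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)  # model only predicts at target positions
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the original token is the training target
            r = rng.random()
            if r < 0.8:
                masked[i] = "[MASK]"
            elif r < 0.9:
                masked[i] = rng.choice(vocab)
            # else: keep the original token (helps the model stay
            # calibrated on uncorrupted inputs)
    return masked, labels

tokens = "the cat sat on the mat".split()
corrupted, labels = mask_tokens(tokens, vocab=["dog", "ran", "hat"], seed=1)
```

Keeping 10% of the selected tokens unchanged may look redundant, but it prevents a mismatch between pre-training (where [MASK] appears) and fine-tuning (where it never does).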
Impact
BERT set new state-of-the-art results on many NLP benchmarks, including GLUE and SQuAD, and is widely used for search, sentiment analysis, and question answering.