BERT (Bidirectional Encoder Representations from Transformers) is Google’s 2018 language model that reshaped NLP. Unlike GPT, which processes text strictly left-to-right, BERT is trained with a masked-language-modeling objective: tokens are hidden and predicted from context on both sides, so every position can attend to the entire sentence. Fine-tuned BERT models became the standard backbone for tasks such as sentiment analysis and question answering, and Google began applying BERT to search ranking in 2019.
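The left-to-right versus bidirectional distinction comes down to the attention mask. As a minimal sketch (not either model's actual implementation), a GPT-style decoder uses a lower-triangular causal mask, while a BERT-style encoder lets every token attend to every position:

```python
import numpy as np

def causal_mask(n):
    # GPT-style: token i may only attend to positions 0..i (its left context)
    return np.tril(np.ones((n, n), dtype=bool))

def bidirectional_mask(n):
    # BERT-style: every token attends to all n positions, left and right
    return np.ones((n, n), dtype=bool)

n = 4  # a toy 4-token sequence
print(causal_mask(n).sum())         # 10 visible (query, key) pairs: lower triangle
print(bidirectional_mask(n).sum())  # 16 visible pairs: full n*n visibility
```

Because the bidirectional mask would make next-token prediction trivial (the answer is visible), BERT instead masks out a fraction of input tokens and trains the model to reconstruct them from the surrounding context.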
Part of the XLUXX AI Encyclopedia.