Updated On: 2021-11-12 by Trevor Stolber
Name – BERT – Bidirectional Encoder Representations from Transformers
Referred to as: BERT or the BERT Algorithm or the BERT Update
Category: On Page SEO / Content
Correct Use: N/A – there is no direct or appropriate way to "use" BERT; it is search engine technology, not an SEO tactic.
Description:
BERT, or Bidirectional Encoder Representations from Transformers, is a major advancement in search technology. I also want to recognize that although Google's BERT implementation got a lot of attention, Microsoft was first to the punch.

Our take: BERT is a significant NLP (natural language processing) technology that helps search engines understand a word from the context of the surrounding text. I envisage it as a moving window over the text as it is parsed, allowing relationships to be drawn and context to be understood. This is a significant advantage over plain text or pattern matching.
This affected about 10% (roughly 1 in 10) of English searches when it was implemented, so it is up there on the significance scale.
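To make the "moving window of context" intuition concrete, here is a minimal, hypothetical sketch of the self-attention step at the heart of a Transformer: each word's vector is rebuilt as a weighted blend of every vector in the sentence, so a word's final representation depends on its neighbors. The sentence and the tiny 2-dimensional "embeddings" below are made up purely for illustration; real BERT uses learned, high-dimensional embeddings and many attention layers.

```python
import math

def softmax(scores):
    """Turn raw similarity scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def self_attention(vectors):
    """Each output vector is a similarity-weighted average of ALL
    input vectors - this is how context flows into each word."""
    outputs = []
    for query in vectors:
        weights = softmax([dot(query, key) for key in vectors])
        blended = [sum(w * vec[i] for w, vec in zip(weights, vectors))
                   for i in range(len(query))]
        outputs.append(blended)
    return outputs

# Toy embeddings for the words of "bank of the river" (invented values).
sentence = [("bank", [1.0, 0.2]), ("of", [0.1, 0.1]),
            ("the", [0.1, 0.0]), ("river", [0.9, 0.8])]
contextual = self_attention([vec for _, vec in sentence])

# "bank" started as [1.0, 0.2]; after attention it has absorbed some
# of "river", shifting it toward the riverbank sense of the word.
print(contextual[0])
```

The point of the sketch is the contrast with keyword matching: the output for "bank" is no longer a fixed vector but one shaped by the words around it.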
Do: Business as usual. There is not much to optimize for with BERT – just be aware of it and understand the trajectory of search technology.
Write for people – people are actually looking for your content, not search engines – always remember that!
Write shorter sentences. There is another player in the NLP algorithm space called SMITH, which performs better on longer passages, but for now the advice is to keep them short!
Make sure you use plenty of white space.
Don’t: Optimize your content for BERT! Just write for people.
Don’t write an essay.
Tip: Be aware of search engines' advanced text and language processing capabilities. SEO is not just matching keywords on a page.
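As a hypothetical illustration of why keyword matching falls short: a naive bag-of-words comparison scores two queries with opposite meanings as identical, because it ignores word order and relationships – exactly the gap that context-aware models like BERT close. The queries below are invented examples.

```python
from collections import Counter

def keyword_overlap(query_a, query_b):
    """Naive bag-of-words similarity: count shared words,
    ignoring their order and grammatical role."""
    a, b = Counter(query_a.split()), Counter(query_b.split())
    shared = sum((a & b).values())
    return shared / max(sum(a.values()), sum(b.values()))

q1 = "flights from new york to london"
q2 = "flights from london to new york"

# Same words, opposite meaning - keyword matching cannot tell them apart.
print(keyword_overlap(q1, q2))  # 1.0
```

A context-aware model would represent "from new york" and "to new york" differently, which is why the direction of a query like this matters to BERT but not to keyword matching.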
Introduced: The BERT algorithm was introduced to Google Search in 2019 and affected about 1 in 10 English searches, so it was no small update.
What Google Says:
https://blog.google/products/search/search-language-understanding-bert/
What experts say:
https://www.searchenginejournal.com/google-bert-update/332161/