BERT's understanding of context is much closer to ours
Eats, Shoots & Leaves by Lynne Truss is famous for explaining punctuation and pointing out how a simple comma can change the meaning of a sentence completely. If it’s hard for us, how hard is it for search engines?
BERT is Google’s latest search algorithm, and it marks a huge step forward in understanding content and context, both in text and optimised video.
Crucially, Google can now look both forwards and backwards within a search query, building a greater understanding of the relationships between all the words in a sentence or phrase.
Google has announced BERT, a natural language processing model it describes as its biggest change in five years, and one that will affect 1 in 10 searches.
It uses AI to model the relationships of words within and across sentences, comparing the relationships of all the words in a sentence at once. Until now, Google has looked at the words in a phrase and tried to work out the importance of each in relation to the others, but on a one-by-one basis. Joining words such as ‘to’ and ‘and’ were ignored, and only the group of remaining words was assessed to understand the precise query.
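To make the idea concrete, here is a deliberately simplified toy sketch (not BERT itself, and not Google's actual algorithm) of why seeing the words on *both* sides of a term helps. The sense cues and the `disambiguate` helper are invented for illustration only: with left-only context, the ambiguous word "bank" cannot be resolved, but with context from both directions it can.

```python
# Toy illustration of bidirectional context (not BERT itself).
# SENSE_CUES and disambiguate() are hypothetical, invented for this example.
SENSE_CUES = {
    "river": {"water", "fishing", "muddy"},
    "finance": {"money", "loan", "account"},
}

def disambiguate(words, target="bank", bidirectional=True):
    """Guess the sense of `target` from surrounding words.

    bidirectional=False uses only the words *before* the target,
    mimicking a left-to-right model; True uses both sides, which is
    the kind of context a bidirectional model like BERT can exploit.
    """
    i = words.index(target)
    context = set(words[:i]) | (set(words[i + 1:]) if bidirectional else set())
    scores = {sense: len(cues & context) for sense, cues in SENSE_CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

sentence = "i sat by the bank fishing in the water".split()
print(disambiguate(sentence, bidirectional=False))  # no cues before "bank": "unknown"
print(disambiguate(sentence, bidirectional=True))   # sees "fishing", "water": "river"
```

The disambiguating words here ("fishing", "water") appear only *after* the target, so a model restricted to left-hand context has nothing to go on; that is the gap bidirectional reading closes.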
This step forward in the semantic understanding of a search query is going to have a massive impact on SEO. It diminishes the importance of single keywords, making searches much more dependent on content and context. It should benefit anyone producing and optimising good content, whether written or video.