Google’s latest search algorithm: BERT
A new algorithm for better understanding complex, natural-language queries
Google announced one of the biggest updates to its search algorithm in recent years. Using new neural network techniques to better understand the intent behind queries, Google says it can now deliver more relevant results for about one in ten English-language searches in the US (support for other languages and locales will follow later). For featured snippets, the update is already live worldwide.
In the world of search updates, where algorithm changes are usually far more subtle, an update that affects 10 percent of all searches is a pretty big deal (and will surely keep SEO experts around the world up at night).
Google notes that this update works best for longer, more conversational queries - and in many ways, that is how Google would prefer you to search these days, because it is easier to interpret a full sentence than a sequence of keywords.
The technology underlying this new neural network is called Bidirectional Encoder Representations from Transformers, or BERT. Google first talked about BERT last year, when it open-sourced its implementation and released pre-trained models. Transformers are one of the more recent advances in machine learning. They work especially well for data where the order of elements matters, which obviously makes them a useful tool for working with natural language and, therefore, search queries.
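Google has not published its production serving stack, but the open-sourced pre-trained models can be tried directly. Below is a minimal sketch, assuming the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint (not Google’s search-tuned model), showing what "bidirectional" encoding of a query looks like in practice:

```python
# Minimal sketch: encode a conversational query with a pre-trained
# BERT model. Assumes `pip install transformers torch`; this is the
# public checkpoint, not Google's production search model.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# BERT reads the whole query at once, so each token's vector
# reflects the words on both sides of it ("bidirectional"),
# rather than treating the query as a bag of keywords.
query = "can you get medicine for someone pharmacy"
inputs = tokenizer(query, return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per token; shape is (1, num_tokens, 768)
# for the base model.
print(outputs.last_hidden_state.shape)
```

The point of the sketch is the contextual output: the vector for "for" here depends on "someone" and "pharmacy", which is exactly the kind of function-word nuance Google says older keyword matching missed.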
This BERT update also marks the first time Google is using its latest Tensor Processing Unit (TPU) chips to serve search results.
Ideally, this means Google Search can now better understand exactly what you are looking for and surface more relevant results and featured snippets. The update began rolling out this week, so chances are you are already seeing some of its effects in your search results.