BERT, or Bidirectional Encoder Representations from Transformers, is a neural network-based technique that Google uses for natural language processing (NLP). BERT was released as an open-source model in 2018, with the aim of helping computers understand language a little more the way we do. According to Google, BERT began rolling out in Search in October 2019 and affects around 10 percent of all search queries.
Google said in its official blog post that BERT would help interpret the context and nuance of the words an internet user types when searching for a product or service. This, in turn, allows the search engine to return better-matched results for those queries. The algorithm also applies to featured snippets, not just organic search results.
Explaining the upgrade to the search algorithm, Google said that BERT makes it much easier for the search engine to surface relevant results. As an example, they noted that when a user searches for "Brazilian traveler to USA needs visa", it is very important to understand the relationship of the word "to" to the other words in order to get the meaning. Previously, the search engine could not grasp this connection, so it would have returned results about people traveling from the US to Brazil. "With BERT, Search is able to grasp this nuance and know that the very common word 'to' actually matters a lot here, and we can provide a much more relevant result for this query," Google explained.
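To see this contextual behaviour in practice, here is a minimal sketch (our illustration, not Google's production setup) using the open-source bert-base-uncased model from the Hugging Face transformers library. The helper function embedding_for_word and the comparison sentence are assumptions made for this example; the point is simply that the same word "to" receives a different vector depending on the sentence around it.

```python
# Minimal sketch: the open-source BERT model assigns context-dependent
# vectors to words, so the same word "to" is represented differently
# in different sentences. Assumes the "transformers" and "torch"
# packages are installed; the model name and sentences are illustrative.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_for_word(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector BERT assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    idx = tokens.index(word)            # first occurrence of the word
    return outputs.last_hidden_state[0, idx]

# The query from Google's example versus an unrelated sentence: the
# vectors for "to" differ because BERT reads each sentence
# bidirectionally and encodes the surrounding context.
a = embedding_for_word("brazilian traveler to usa needs visa", "to")
b = embedding_for_word("she handed the report to her manager", "to")
print(torch.cosine_similarity(a, b, dim=0).item())  # below 1.0: different contexts
```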
Experts say that BERT is the next big thing after RankBrain, Google's first AI algorithm for understanding search queries. Google itself described the [BERT] update as "representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search."
RankBrain works by analyzing search queries as well as the content on web pages in order to return the most applicable results to the user. BERT is not here to replace RankBrain; rather, it adds a more advanced way of understanding web content and search queries. Because it complements Google's page-ranking system, it is now even more important for SEO companies to create website content with the user's objective in mind.
Note that, just as an SEO company cannot directly optimize for RankBrain, Google has said there is no real way to optimize for BERT either. However, because the algorithm helps the search engine understand natural language much better, the best search engine optimization strategy is to keep writing quality web content that focuses on user intent and relevance.