BERT Google Update
Google is always making updates to its algorithm. In fact, each year it makes thousands. Yes, thousands. They aren't all as groundbreaking as, say, the Penguin update. Most are tiny optimizations so Google can function better as a true answer machine. You can read all about the updates in Moz's comprehensive overview here.
But chances are, you're here because you're interested in one of Google's latest big updates. BERT affects 1 in 10 searches. So, what is the Google BERT update? BERT stands for Bidirectional Encoder Representations from Transformers. But that doesn't help us much, does it? In layman's terms, the update focuses on understanding conversational language, determining word context, and deducing more accurate user intent.
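The "bidirectional" part of the name is the key idea: to figure out what a word means, the model looks at the words on both sides of it, not just the ones that came before. Here is a deliberately tiny, hypothetical sketch of that intuition (this is a toy counting exercise, not how BERT actually works internally; the corpus and function names are made up for illustration):

```python
# Toy illustration of "bidirectional" context -- NOT real BERT.
# To guess a hidden word, we use the words on BOTH sides of it,
# rather than reading left-to-right only.

from collections import Counter

# Tiny hypothetical corpus of search-like phrases (invented for this sketch).
corpus = [
    "pick up medicine for someone else",
    "pick up medicine for a friend",
    "pick up groceries for someone else",
    "fill a prescription for a patient",
]

def guess_masked(left: str, right: str) -> str:
    """Return the word most often seen between `left` and `right`."""
    counts = Counter()
    for phrase in corpus:
        words = phrase.split()
        for i in range(1, len(words) - 1):
            if words[i - 1] == left and words[i + 1] == right:
                counts[words[i]] += 1
    return counts.most_common(1)[0][0] if counts else "?"

# The left-hand word "up" alone could be followed by medicine, groceries,
# or flowers; adding the right-hand word "for" as context sharpens the guess.
print(guess_masked("up", "for"))  # prints "medicine"
```

A real transformer learns these contextual relationships from billions of sentences instead of counting them in a four-line corpus, but the direction of the lookup is the same: both sides at once.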
Google is pretty good at its job. When you need an answer, you probably turn to Google. That said, if you have a specific, longer, more complex question and don't get the right results on your first try, you might find yourself typing into the Google search box in a way that doesn't exactly reflect your question. You probably take out prepositions and extra words that you think might confuse Google. In this update, Google set out to fix that. Using machine learning, the update ensures Google better understands long-tail queries.
As Google put it when announcing the update: "At its core, Search is about understanding language. It's our job to figure out what you're searching for and surface helpful information from the web, no matter how you spell or combine the words in your query. While we've continued to improve our language understanding capabilities over the years, we sometimes still don't quite get it right, particularly with complex or conversational queries."
One of the biggest changes affects search queries that include "for," "to," and other similar prepositions. With the BERT update, Google better understands the context of these small but mighty words and how they affect the intent of your search query.
You can see the example Google provided below. In this case, the word "to" is crucial to understanding the question. Prior to the BERT update, the #1 search result did not answer the question at all because it was about US citizens traveling to Brazil. Now that Google better understands prepositions within your phrasing, it surfaces a more relevant resource about traveling to the United States if you are from Brazil.
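You can see why this was hard with a small, hypothetical sketch. Keyword-style matching often dropped stop words like "to" and ignored word order, which collapses two opposite questions into the same bag of keywords; a transformer-style model consumes the full ordered token sequence, so the distinction survives. (The stop-word list and queries below are invented for illustration, not Google's actual processing.)

```python
# Hypothetical sketch: why "small but mighty" prepositions matter.
# A naive keyword matcher strips stop words and ignores order; that
# erases the direction of travel encoded by "to".

STOP_WORDS = {"to", "for", "a", "the", "of"}

def bag_of_keywords(query: str) -> frozenset:
    """Naive pre-BERT-style view: unordered keywords, stop words removed."""
    return frozenset(w for w in query.lower().split() if w not in STOP_WORDS)

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

# With stop words stripped and order ignored, the two opposite
# questions become indistinguishable.
print(bag_of_keywords(q1) == bag_of_keywords(q2))  # prints True

# The full ordered token sequence -- what a transformer reads --
# keeps them apart.
print(q1.split() == q2.split())  # prints False
```

The whole point of BERT is that "to" stays in the input and gets weighed in context, so the two queries above no longer resolve to the same result.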
Other examples released by Google include phrases like "can you get medicine for someone pharmacy". Before BERT, the top result was a resource all about getting a prescription filled. At that point, Google failed to understand that the user's intent was specifically about picking up a prescription on behalf of someone else. With the BERT update, Google now understands "for" as a crucial element of the phrase.
These updates are also present in featured snippets. Google is now better at selecting the most accurate content and imagery to appear directly in search results.
As with all updates, the quality of your content is still the most important ranking factor. Neil Patel makes a great point in his blog post on BERT. He gives the example of someone searching for "how to lose weight without taking pills". "Without" is the key word here, as we have mentioned previously in this post. However, Google is also able to deduce intent. If you are searching for ways to lose weight without taking pills, chances are you are equally uninterested in supplements, shakes, powders, etc. Now, Google is able to provide you with content that is more in line with your intent. It understands that a user searching that phrase is looking for more natural forms of weight loss.
How to Optimize for BERT
Well, there's good news and bad news, and they're the same thing. You can't exactly optimize for BERT. Think of BERT more as an improvement in how Google understands your content and the intent of users looking for the answers you provide. The best thing you can do to optimize for BERT is to write quality content for humans, not search engines. Google will handle the rest.
What is the BERT update?
- BERT focused on understanding conversational language, determining word context, and deducing more accurate user intent.
How many searches are affected by BERT?
- 1 in 10, or 10% of all searches.
Why am I not seeing a change in my website after BERT?
- If you are only tracking shorter keywords, you won't see a change in rankings, because BERT primarily affects long-tail keyword phrases.