What Is Google BERT? Experts Explain
Google BERT is an update to the search giant’s algorithm that had, and continues to have, a big impact on business.
If you understand BERT, you can get a leg up on the competition—and set yourself up for future search success.
To help you do that, this post provides a complete rundown of BERT and why it’s important. It also features advice from three experts we interviewed about the update.
What does Google BERT stand for?
Before we dive into what BERT is, you should know what BERT stands for.
BERT stands for “Bidirectional Encoder Representations from Transformers.”
These terms reference aspects of the AI-powered language models that make up BERT. They’re also a mouthful, which is why Google abbreviated the name.
What is Google BERT?
But what is Google BERT exactly?
Google BERT is an AI language model that the company now applies to search results.
Though it’s a complex model, Google BERT’s purpose is very simple:
It helps Google better understand the context around your searches.
BERT uses AI in the form of natural language processing (NLP), natural language understanding (NLU), and sentiment analysis to process every word in a search query in relation to all the other words in a sentence.
In the past, Google processed the words in a query one by one, in order. The difference in results between the old and new approaches can be dramatic.
Google offers the example of a search like “2019 brazil traveler to usa need a visa.”
In the past, Google would have interpreted this search as a US traveler looking for a visa to Brazil.
That’s because Google didn’t account for prepositions and context inherent in human language. In this example, Google would not have taken the word “to” into account. That changes the meaning of the search.
Compare that to BERT’s approach.
BERT takes the whole sentence into account, including prepositions. In this example, BERT now understands the searcher is a Brazilian looking for a US visa—not the other way around.
A lot of people use natural language to search for information. This language includes plenty of context clues that change search meaning.
Thanks to BERT’s NLP model, Google can now return results that better reflect this context.
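To make the visa example concrete, here is a deliberately simplified toy sketch (our illustration, not Google’s actual algorithm) contrasting an order-blind, bag-of-words reading of the query with one that, like BERT, interprets each word relative to its neighbors:

```python
# Toy illustration of why word order and prepositions matter for the
# query "2019 brazil traveler to usa need a visa". This is NOT how
# Google or BERT is actually implemented -- just an intuition pump.

COUNTRIES = {"brazil", "usa"}

def interpret_bag_of_words(query):
    """Old-style matching: order and the word 'to' are ignored, so
    origin and destination cannot be told apart."""
    words = set(query.lower().split())
    return {"origin": None, "destination": None,
            "countries": sorted(words & COUNTRIES)}

def interpret_with_context(query):
    """BERT-style reading (greatly simplified): words are read in
    relation to their neighbors, so 'X ... to Y' marks Y as the
    destination and X as the origin."""
    words = query.lower().split()
    if "to" not in words:
        return {"origin": None, "destination": None}
    i = words.index("to")
    origin = next((w for w in reversed(words[:i]) if w in COUNTRIES), None)
    dest = next((w for w in words[i + 1:] if w in COUNTRIES), None)
    return {"origin": origin, "destination": dest}

query = "2019 brazil traveler to usa need a visa"
print(interpret_bag_of_words(query))   # intent ambiguous: just two countries
print(interpret_with_context(query))   # origin: brazil, destination: usa
```

The second function only "understands" the query because it reads the words on both sides of “to” — which is the intuition behind the bidirectional context BERT models at vastly greater scale.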
Google says the BERT model will affect 10% of all US searches, so it’s a big deal. And the language model that powers BERT also understands non-English languages. So, expect its impact to grow even bigger over time.
What impact will Google BERT have on SEO?
But what is that impact exactly?
How is BERT’s machine learning language model going to change your search results? What impact will it have on how Google’s algorithm assesses your content?
We asked three experts from top marketing technology companies for their take.
“The introduction of BERT is very similar, in terms of implications for marketers, to the release of RankBrain,” says Matthew Howells-Barby, Director of Acquisition at HubSpot.
“Ultimately, BERT will dramatically increase Google’s ability to better understand the context behind queries being made in the search engine, thus allowing them to better serve results that match intent.”
Businesses shouldn’t be looking for ways to game the new update.
“There’s no ‘optimizing for BERT’ in the same way that there was no ‘optimizing for RankBrain’ (despite the many articles claiming otherwise).
The result of BERT being introduced has simply meant that Google’s natural language processing engine has reached new heights and you can expect much more granular answering of queries. If you’re already optimizing for intent, you’re in a good place—this is the essence of what our Topic Cluster methodology is all focused on.”
Instead, you must prioritize consumer intent.
“BERT is Google’s next iteration in its long-running effort to better map search results to search intent,” says Lemuel Park, CTO at BrightEdge.
“It is Google’s neural network-based method for natural language processing (NLP) to understand queries that are more conversational in nature.”
“This is an algorithm change and not an update,” explains Park. “As a marketer, this means increasing the specificity and depth of content and working farther into the longtail, or queries using more than three words.”
“Longtail keywords present an opportunity for marketers to develop content that has lower volume per page and less competition but necessarily requires more breadth in content development, a strategy and tactic that was built into the BrightEdge platform in 2014.”
His advice for navigating BERT? Focus on consumer intent.
“The best thing you can do as a digital marketer is to maintain and increase your understanding of consumer intent by looking at conversational search themes and keywords, ensuring that you have the right content — optimized and on your website — to address the question or query.”
To do that, you should create comprehensive, topic-rich content.
“The ability of BERT to better understand the intent behind long conversational-style search phrases, and surface relevant content, is a welcome development,” says Jeff Coyle, Co-Founder and Chief Product Officer at MarketMuse.
“This is further evidence that marketers need to shift from keyword-driven articles towards crafting comprehensive and topically-rich content.”
But, Coyle cautions, BERT’s impressive capabilities are also its biggest disadvantage. The way BERT contextualizes sentences ends up using a lot of computational firepower, which could limit its use in products or services.
“Because BERT is a technology that creates ‘contextualized’ vectors, I have to feed both sentences into the BERT network,” says Coyle. “That means I need to do thousands of FLOPS, as BERT is a deep neural network with many layers and neurons.”
“This capability of contextualizing makes BERT and its related models state-of-the-art on many natural language understanding tasks, but it also makes it compute-intensive and hard to bring into production. That’s why it is powerful for a question or rewrite service but has limitations.”
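Coyle’s point about compute cost can be sketched with some back-of-envelope arithmetic (our hypothetical numbers, not MarketMuse’s): because contextualized vectors depend on both sentences at once, nothing can be precomputed, so every sentence pair requires a full forward pass through the network.

```python
# Back-of-envelope sketch of why contextualized (cross-encoding) models
# like BERT are expensive at scale. The figures below are illustrative
# assumptions, not measured numbers.

def forward_passes_cross_encoder(n_queries, n_candidates):
    # Contextualized vectors depend on BOTH sentences, so nothing can
    # be precomputed: one full network pass per (query, candidate) pair.
    return n_queries * n_candidates

def forward_passes_precomputed(n_queries, n_candidates):
    # With fixed, non-contextual embeddings, each sentence is encoded
    # once, and pairs are compared with cheap vector math afterward.
    return n_queries + n_candidates

q, c = 1_000, 10_000  # hypothetical: 1k queries against 10k candidate passages
print(forward_passes_cross_encoder(q, c))  # 10,000,000 expensive passes
print(forward_passes_precomputed(q, c))    # 11,000 passes total
```

The roughly thousand-fold gap in network calls is why, as Coyle notes, BERT-style contextualization is practical for a focused service like answering or rewriting a single query, but hard to bring into production for bulk scoring.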