October was a big month for Google. On the 23rd, the search giant proclaimed that it had achieved “quantum supremacy,” a term indicating the creation of a quantum computer capable of performing a task that would take current supercomputers 10,000 years to complete. While some dispute Google’s claim to quantum supremacy, it remains a significant milestone nonetheless.
Two days later, on October 25th, Google announced its latest algorithm update, dubbed BERT. According to the company, the BERT Google update is “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search.”
While this may not seem as consequential as quantum supremacy, for SEOs around the globe, that second announcement is very nearly on par.
The fact is that keeping up with Google’s algorithm updates can be a full-time job, particularly when one considers that the company doesn’t always explain why fluctuations in the SERPs occur or how sites can adjust their tactics to meet the new standards.
While most of Google’s updates are done in the name of increasing SERP relevance by rearranging how indexed data are shown to searchers, this latest algorithmic alteration is slightly different.
What is the BERT Google Update All About?
BERT, an acronym for Bidirectional Encoder Representations from Transformers, is a leap forward for the engine’s natural language processing capabilities as the algorithm update is targeted toward enabling Google to understand words in the context of a sentence better.
However, while this update may seem like new ground for Google, it is actually not novel at all. Instead, BERT is part of a progression that has been taking place for many years.
As technologies and tactics for voice search optimization have become ever more commonplace, queries have grown increasingly more conversational. But this trend has been burgeoning for years, as has Google’s attempt to attune the SERPs to the behavioral shift.
In fact, it was the 2013 Hummingbird update that first began addressing the shift in search toward conversational queries. BERT is merely an extension, a continuation of this progression.
With that said, let’s take a closer look at the inner workings of the BERT Google update and how this will impact search moving forward.
How BERT Works
In technical terms, BERT is “a neural network-based technique for natural language processing (NLP) pre-training.”
In layman’s terms, BERT uses natural language processing and pattern recognition to understand the nuances of language better. Bringing these two technologies together enables Google to elevate its comprehension of the subtleties within human communication and reorganize the SERPs to feature more relevant results that more closely align with the searcher’s intent.
During training, BERT learns not only a contextual understanding of specific words in relation to other terms in a sentence but also the relationships between sentences.
For instance, in training BERT, the program receives pairs of sentences and then learns to determine if the second sentence is used in the original document from which the phrases are pulled. As is explained in Google’s AI Blog on BERT:
“Given two sentences A and B, is B the actual next sentence that comes after A in the corpus, or just a random sentence?”
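This next-sentence-prediction setup is easy to sketch. The following toy Python snippet (an illustration of the training-data construction described in the paper, not Google’s actual code; the function name and corpus are my own) builds sentence pairs from an ordered corpus, labeling each pair according to whether the second sentence truly follows the first:

```python
import random

def make_nsp_pairs(sentences, seed=0):
    """Build next-sentence-prediction training pairs from an ordered
    list of sentences: roughly half the pairs use the true next
    sentence (label True), the rest a random one (label False)."""
    rng = random.Random(seed)
    pairs = []
    for i in range(len(sentences) - 1):
        if rng.random() < 0.5:
            # positive example: the real next sentence in the corpus
            pairs.append((sentences[i], sentences[i + 1], True))
        else:
            # negative example: any sentence that is not the true successor
            j = rng.randrange(len(sentences))
            while j == i + 1:
                j = rng.randrange(len(sentences))
            pairs.append((sentences[i], sentences[j], False))
    return pairs

corpus = [
    "BERT is a language model.",
    "It reads text bidirectionally.",
    "Google uses it to interpret queries.",
    "Featured snippets also benefit.",
]
pairs = make_nsp_pairs(corpus)
```

In real pre-training, BERT sees enormous numbers of such pairs and learns to classify them alongside its word-level objective, which is how it picks up relationships between sentences rather than just between words.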
Therefore, the BERT Google update aims to better understand the intent of the user and, when possible, predict the user’s needs.
In a world in which mobile and voice search are surging in popularity, users are increasingly seeking contextual answers to their queries, and Google is aiming to become more refined in its analysis of the intent behind those searches.
As a result, the Google SERPs are now being molded around the searcher’s intent instead of the keywords entered. Under this new paradigm, there will likely be far more instances where the exact keywords (or keyword variants) from a query do not appear on the results pages, thanks to Google’s ability to speak directly to intent.
This raises the question: What will this look like in practice?
As the BERT Google update announcement blog cites as an example:
“Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.”
For a human, it is clear that the query “2019 brazil traveler to usa need a visa” is about answering if someone coming from Brazil to the U.S. in 2019 requires a visa.
However, for computers, this query can be troublesome, as older algorithms would omit the word “to” from the analysis. Through BERT, all the words are accounted for in understanding the real intent of the search.
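To see why dropping a word like “to” is so costly, consider a toy comparison (purely illustrative; the stopword list and names here are my own, not Google’s): once stopwords are stripped and word order is discarded, two queries with opposite meanings become indistinguishable.

```python
# Toy illustration: old-style keyword matching that drops stopwords
# cannot tell apart queries that differ only in "function" words.
STOPWORDS = {"to", "a", "the", "of", "for"}

def keyword_bag(query):
    """Reduce a query to an unordered set of keywords, stopwords removed."""
    return frozenset(w for w in query.lower().split() if w not in STOPWORDS)

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

# The two queries mean opposite things, yet collapse to the same bag:
same = keyword_bag(q1) == keyword_bag(q2)   # True
```

A model like BERT, which reads every word of the query in both directions, retains exactly the directional information this bag-of-keywords view throws away.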
Similarly, another example that Google provides is the query, “can you get medicine for someone pharmacy?” Previously, the engine would have returned information on filling a prescription. Using BERT, Google now grasps that the intent of the search is to learn whether a person can pick up a prescription for someone other than themselves.
This newfound ability, as dictated by the BERT Google update blog, will also have an impact on providing more relevant featured snippets.
One of the examples used for this is the query “parking on a hill with no curb,” where Google would have previously ignored the word “no,” placing the emphasis on “curb” and thereby returning results for how to park on a hill with a curb. However, thanks to BERT, the engine now understands the importance of the word “no” in the context of the sentence and returns the proper information via the answer box.
Moreover, Google is using BERT to improve featured snippets in over two dozen countries across the globe, already seeing improved results in languages like Portuguese and Korean.
As can be surmised from these examples, BERT is geared toward understanding more complex queries that utilize long-tail keywords as opposed to short-head searches.
That said, Google has stated that BERT will impact one out of every 10 searches, making it the most significant algorithmic change since the introduction of RankBrain.
Given the scope of this update, many marketers are left to wonder how the BERT Google update will affect search engine optimization strategies and SEO overall.
How BERT Impacts SEO
While not explicitly stated thus far, it is essential to note that BERT does not necessarily impact how pages are ranked but rather how the engine understands a query, reorganizing listings to better meet the user’s intent.
Therefore, with the rollout of BERT, Google advises searchers to relinquish their keyword-oriented query tactics in favor of a more natural approach. After all, BERT relies on such natural queries to learn and get better at delivering relevant results.
Moreover, the change to the company’s algorithms is capable of gaining insights from one language and applying those teachings to another. For instance, improvements to the system in Spanish can potentially impact English results for greater relevance in relation to the query entered.
That said, the emphasis is always on the usage of language, meaning that poorly written content and websites will suffer under BERT.
When speaking to effective content marketing strategies and on-page SEO efforts, employing high-quality copy is more important than ever as BERT targets user intent. Therefore, any pieces that prioritize rankings over readers will continually diminish in usefulness.
As has been the trajectory of Google for years, the user experience is of prime importance. This means that the BERT Google update is intimately aligned with answering the searcher’s question as quickly and efficiently as possible by supplying the most valuable content.
So, how can retailers meet the new demands laid down by the BERT update?
How to Optimize for BERT
The adage, “garbage in, garbage out,” deftly applies to the BERT algorithm.
What this means is that if retailers produce poorly written content, they will obtain poor SEO results that negatively impact their brands.
Given that the BERT Google update is focused on understanding natural language more proficiently so as to respond aptly to the user’s intent, it stands to reason that the more capable a content creator is at employing natural, relatable language, the better a site’s SEO outcomes will be.
However, it is worth noting that, much like with RankBrain, there is not a true methodology for BERT optimization. The advice that Gary Illyes handed down regarding RankBrain optimization was that sites should “optimize [their] content for users and thus for RankBrain.”
The fact is that BERT seeks to understand language in the same way that humans do, which means that when it comes to optimization, the best course of action is to generate conversational copy.
Moreover, given that searchers are likely looking to obtain specific information related to their query, content should be focused on a single topic that addresses particular questions concisely, while still providing considerable value overall.
Instead of writing tome-like blogs that fork in a myriad of ancillary directions, it is best to take deep dives into a singular topic, weaving in succinct statements, thereby optimizing for Google’s answer box.
It is important to understand that this does not mean that long-form content will fall by the wayside. Such content will remain valuable, as it is likely to contain the answers that directly address consumer intent. This is particularly true for those who utilize the skyscraper technique as a means of generating content that is more valuable than all other offerings on a specific topic.
Additionally, long-tail keywords, of which long-form pieces naturally include many, will continue to possess considerable power in the SERPs. In fact, the weight of such terms could very well increase, as the BERT Google update is oriented more toward these types of phrases than short-head terms.
However, this is not the only reason for keyword research strategies to focus more on long-tail phrases. The fact is that consumers are researching topics and products more deeply than ever before. As Google grows increasingly sophisticated in its natural language processing abilities, long-tail terms will continue their ascent as a result of people’s proclivity for product research.
Therefore, when considering on-page SEO strategies for product pages, landing page optimization, content marketing efforts and other tactics that revolve around convincing copy, it is vital that merchants make use of adept writers who can balance a conversational voice, persuasive writing techniques and knowledge of a given topic to create content that will lend itself to BERT’s aims.
However, in order to achieve these ambitions, retailers must first understand the intent of their audience before content marketing can help their eCommerce business in the SERPs.
For sellers to identify intent in search patterns, it requires a substantial commitment of time and energy and probably a significant overhaul of a brand’s keyword strategy. While this can be done, those with the resources should seriously consider hiring an eCommerce SEO agency that can devote the proper amounts of time and energy to sift through a company’s data and reorient the brand’s search and content strategy.
Other Useful Resources for Understanding BERT
The topic of deep learning and the future of search engine optimization is complex, and most will find it a formidable task to wrap their heads around entirely.
Therefore, those who want to really dive into the topic to obtain the most comprehensive understanding possible should look into the research surrounding BERT.
For starters, those that need a bit more information on natural language processing are advised to read over an introduction to NLP for a more solid foundation.
From there, the original BERT paper contains everything retailers need to understand precisely how the BERT Google update works. That said, the writing is scholarly, and laymen will likely need some aid.
However, Keita Kurita has dissected the BERT paper, creating a much more readable version that is likely to be more helpful to those without the technical chops to dive into the academic version. Similarly, Rani Horev’s piece on BERT is also a great explainer of the original document.
Finally, since Google open-sourced BERT, the whole thing can be reviewed and experimented with through the materials over at GitHub.
Between this article and the pieces linked above, retailers should be able to procure a solid understanding of how natural language processing, and BERT in particular, is evolving and impacting the world of search.
The key takeaway about the BERT Google update is that the search giant is coming ever closer to understanding user intent through language analysis. When it comes to ranking higher in Google, sellers must understand what their audience is looking for at various points in the buyer’s journey and provide resources that meet those needs, be they informational or transactional.
But, since there is no real way to optimize for BERT, the best thing to do is to adapt keyword strategies as they relate to intent and craft the best content possible.
While this might seem like something of a herculean feat for some store owners, know that you are not alone. If you need help figuring out how to direct your marketing efforts in the era of BERT, reach out to Visiture for a free consultation, and we can help reorient your company’s search strategy.