What the Google BERT Update Really Means for SEO
Recently, Google publicly announced what it has dubbed one of “the most important updates” in the last five years. Sounds ominous, but what does that really mean for search results and SEOs?
In this guide, Infintech Designs will help you decipher Google’s algorithm so you can learn how to adapt and dominate the search results.
Google’s Official Press Release / Statement
Officially, Google stated the following about BERT:
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
“Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
OK, great, but what does all that mean?
What is BERT?
BERT isn’t just a cute name Google decided to give the new update.
The letters actually stand for something. The BERT algorithm is known as “Bidirectional Encoder Representations from Transformers”.
This algorithm is deep-rooted in natural language processing fundamentals and is used for:
- Entity recognition
- Part-of-speech tagging
- Question answering; and
- Other natural language processing tasks
In layman’s terms, BERT is meant to help Google’s search engine better understand what the words in a sentence mean, including the multi-faceted nuances of context.
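To see what “bidirectional” buys you, here is a minimal sketch using the open-source Hugging Face transformers library and the publicly released bert-base-uncased model (not Google’s production search system). Because the model reads in both directions at once, words that come after a blank change how it gets filled:

```python
# A minimal sketch, assuming the Hugging Face "transformers" package.
# BERT reads context on both sides of a blank, so words *after* the
# blank change its guess.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Identical left context; only the right context differs.
print(unmasker("The [MASK] barked at the mail carrier.")[0]["token_str"])  # e.g. "dog"
print(unmasker("The [MASK] meowed at the mail carrier.")[0]["token_str"])  # e.g. "cat"
```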
BERT’s Prediction Model
As part of BERT’s machine learning process, the algorithm takes pairs of inputs (queries or phrases) and learns from data whether the second sentence is a proper pair with the first.
In other words, it takes the first string of words and works out which webpage would best continue the line of thought or questioning the user had in mind when they entered the query.
In short, the algorithm works to better understand the intent, need, or reason behind the search query, and even goes a step further to potentially predict the current or future need that query will lead to.
By doing so, Google hopes to better match results with the “why” behind each user’s search, helping them achieve their goals accurately, quickly and efficiently.
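For the technically curious, this pairing task is part of how the public BERT model was pre-trained, under the name “next sentence prediction.” The sketch below, again assuming the Hugging Face transformers package, scores how plausibly one text continues another; it illustrates the training objective, not Google’s actual ranking pipeline:

```python
# A minimal sketch of BERT's "next sentence prediction" objective,
# assuming the Hugging Face "transformers" and "torch" packages.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
model.eval()

def follows(first: str, second: str) -> float:
    """Probability that `second` is a sensible continuation of `first`."""
    inputs = tokenizer(first, second, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Index 0 = "B continues A", index 1 = "B is a random sentence".
    return torch.softmax(logits, dim=1)[0, 0].item()

query = "how do I change a flat tire"
print(follows(query, "First, loosen the lug nuts before jacking up the car."))
print(follows(query, "The stock market closed higher on Tuesday."))  # lower score
```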
How Impactful Will BERT Be in Regard to Understanding Search Queries?
Because BERT looks deeper into context and meaning, strings of words need to be written more precisely to reduce ambiguity.
Think of the word Apple. Depending on the context, this word could be referring to Apple the company or a piece of fruit.
BERT will also look at phrase combinations to better understand the meaning and intent behind the words.
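A rough way to see this in action: the public BERT model gives the word “apple” a different vector in each sentence, so the two senses can be told apart. Here is a minimal sketch, with made-up sentences, again assuming the Hugging Face transformers package:

```python
# Compare BERT's contextual embeddings for "apple" in different senses.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def apple_vector(sentence: str) -> torch.Tensor:
    """Return the contextual embedding of the token 'apple'."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index("apple")]

fruit = apple_vector("she ate a fresh apple with her lunch")
fruit2 = apple_vector("he picked a ripe apple from the tree")
company = apple_vector("apple announced a new phone this fall")

cos = torch.nn.functional.cosine_similarity
print(cos(fruit, fruit2, dim=0).item())   # higher: both fruit senses
print(cos(fruit, company, dim=0).item())  # lower: different senses
```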
What is the Extent of BERT’s Influence on Queries?
According to Google, BERT will initially only impact around 10% of all search queries.
Although 10% sounds minimal, it actually represents a very significant amount of searches.
While BERT may only impact 10% of searches “overall”, there will be sub-categories of search types where BERT may impact a substantially larger percentage (think local or buyer search phrases).
It is hypothesized that BERT may impact longer tail search phrases more heavily, as the need for context increases with query length.
Although long-tail phrases each individually make up a small search percentage, the cumulative total of them makes up a significant portion of Google’s user queries.
What Does BERT Mean for On-Page SEO?
The bi-directional nature of BERT should improve the understanding of context on-page, which is good news for SEO, and for those businesses diligent enough to craft their content around a singular purpose while also providing semantically relevant supplemental information.
That said, there are a few things we can do with on-page SEO to help this algorithm along:
- Emphasize the importance of key on-page concepts
- Provide clear on-page structure and hierarchy of content
- Aid in transforming unstructured data into structured data (see the sketch after this list)
- Proper internal linking
And more…
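On the structured-data point, one common approach is schema.org markup embedded in the page as JSON-LD. Here is a minimal sketch of generating such a block in Python; the field values are hypothetical:

```python
# Build a schema.org Article block as JSON-LD (hypothetical values).
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What the Google BERT Update Really Means for SEO",
    "author": {"@type": "Organization", "name": "Infintech Designs"},
    "about": "Google's BERT algorithm update and its impact on SEO",
    "datePublished": "2019-11-15",  # hypothetical date
}

# The output belongs inside a <script type="application/ld+json"> tag
# in the page's <head>.
print(json.dumps(article_schema, indent=2))
```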
Because of how BERT works, we may begin to see more number-one results for queries where the content doesn’t even contain the original “search string.”
As Google becomes smarter, optimization will be less about “exact match” keywords and phrases and more about the intent, purpose, or goal of the query itself.
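As a toy illustration of intent matching versus exact-match keywords, the sketch below scores hypothetical page titles against a query by comparing mean-pooled BERT embeddings rather than shared words. Raw BERT embeddings are a crude similarity measure, and this is nothing like Google’s actual ranking system, but it shows the principle: a page can match a query it shares almost no words with.

```python
# Toy semantic matching: mean-pooled BERT embeddings instead of keywords.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pooled BERT embedding for a short text (crude but simple)."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden.mean(dim=0)

query = embed("my laptop won't turn on")
titles = [  # hypothetical page titles
    "Troubleshooting a computer that fails to boot",
    "Best laptop deals this holiday season",
]
for title in titles:
    score = torch.nn.functional.cosine_similarity(query, embed(title), dim=0)
    print(f"{score.item():.3f}  {title}")
```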
How Does BERT Affect Content Marketing?
By now you’ve likely noticed a repeating theme, and that’s because the point needs to be driven home.
Relevancy and specificity as it relates to both the content of a query and the intent behind it will be paramount in 2020 and beyond.
Content marketing, as we all know, involves creating engaging, valuable, and relevant content, published and promoted across a range of media in order to attract and grow an audience.
BERT creates a unique opportunity for creative writers and content creators to “speak” in a more human way to their audiences without having to worry as much about the nuances of “optimization” for particular keywords or phrasing.
The best performing content under the watchful eye of BERT will be that which is specifically focused around a narrow topic, and that which answers the searcher’s question or intent quickly and efficiently while providing superior value as compared to the competition.
In the recent past, long-form content has ruled supreme.
With the introduction of BERT, many experts agree that we may begin to see a balance restored: content long enough to provide value and answers without being exceedingly long.
What we expect to see with BERT is that content that is too long may not be narrowly focused enough to be as relevant to the query as it should be, while content that is too short or “thin” may not be seen as providing enough value to the end-user.
Closing Thoughts Regarding BERT
Currently, BERT has only been implemented for English-language searches in the United States, but Google has said it expects to apply the algorithm to other locations and languages soon.
The major takeaway is that BERT is able to process, understand, and in some cases even predict the relationship of words in the context of a given phrase or sentence.
This enables the model to better handle “conversational” long-tail queries, better matching them to search results that satisfy the user’s intent or needs.
Although BERT and natural language processing models are still in their infancy, the future looks promising for these types of algorithms to dominate in the years to come.
As digital marketers, we can begin to prepare by modeling our SEO and content efforts to reflect what Google has said they want all along: high-quality content that is valuable and relevant to the user’s search query.