Google has a long history of refining its algorithms. These updates range from small tweaks to major shifts in the way the algorithms work and rank content. Below is a brief overview of the major updates Google has made to its search engine algorithms and how they have affected the ranking of content:
The update that ushered in a new era of SEO. Google’s Florida update destroyed the rankings of websites that used manipulative techniques (such as keyword stuffing and hidden links) to boost their position on Google.
Panda updates and iterations: A series of Google algorithm updates that effectively put an end to “content farms” and similar low-quality websites. Google added new signals with each iteration to refine how Panda affected search results.
The Penguin update’s main task was to down-rank manipulative websites, with spammy links as its primary target.
Hummingbird was designed to go after websites that used keyword stuffing and low-quality content. It also brought about a change in the way Google interpreted search queries. It aimed to provide search results that targeted user intent rather than just the terms in the query.
Pigeon increased the importance of both on-page and off-page SEO. It led to a wave of off-page link-building penalties around the world, and a whole industry had to reinvent itself afterwards. Due to our penalty-safety rules, none of our customers was ever penalized by Google.
Google’s Mobile update made mobile optimization a necessity: pages that weren’t optimized for mobile use were down-ranked in mobile search results.
As part of the Hummingbird algorithm, RankBrain uses machine learning to help Google understand the meaning behind the queries. RankBrain is an essential part of Google’s search engine algorithms, and it’s thought to identify the relevance of websites ranking for any given query.
Possum made a user’s location an important factor in the search results they are presented with. It also provided more variety in results for queries that seem similar.
Low-quality blogs that were created only for generating ad revenue were the main targets of Google’s Fred update.
As we can see, Google has a long history of making refinements and updates to its algorithms. Some are small tweaks, while others can completely change the way Google ranks content. Google focuses on providing users with the best possible matches to their search queries, which is where Neural Matching comes in.
Neural Matching is Google’s new AI-based algorithm that affects the way Google’s search engine links a user’s search queries to the text on a particular webpage. Here, it is important to note that Google has not officially named the algorithm; Neural Matching may just be a codename for it. However, we do know what it does. According to Danny Sullivan:
“Neural matching helps Google better relate words to searches.” He also says, “For example, Neural Matching helps us understand that a search for ‘why does my TV look strange’ is related to the concept of ‘the soap opera effect.’”
What this tells us is that Google is putting its focus on the content presented on a website rather than just the links and keywords.
A research paper from 2016 discusses Contextual Long Short-Term Memory (CLSTM) models for natural language processing tasks. This paper might be related to what we are calling “Neural Matching.”
The paper highlights the following tasks that the model handles well:

Next-sentence selection: This is best compared to the “smart replies” suggested for calls and texts.

Word prediction: The clearest example of this type of prediction is when you’re typing a sentence on your phone and the software suggests what you’re going to type next based on the previous words in the sentence.

Sentence topic prediction: Of the tasks in the paper, this is the one that most closely resembles Neural Matching. The paper describes it as predicting the topic of a response to a user’s query in order to understand their intent.
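To make the word-prediction idea concrete, here is a minimal sketch of a bigram-based next-word predictor. This is purely illustrative: the corpus is made up, and Google's actual systems use neural models (such as the CLSTM described in the paper) trained on vastly more data, not simple frequency counts.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus for illustration only.
corpus = (
    "why does my tv look strange "
    "why does my tv look blurry "
    "why does my phone look strange"
).split()

# Build bigram counts: for each word, count which words follow it.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("my"))    # "tv" — follows "my" most often in this corpus
print(predict_next("look"))  # "strange"
```

The same principle, scaled up and combined with context about the surrounding sentence, is what lets a keyboard app finish your sentence and, at a much larger scale, lets a search engine relate a query's words to related concepts.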
That last point is incredibly similar to what Google says that Neural Matching does, i.e. helping Google relate words to searches.
It’s important to note that Neural Matching might not be its own algorithm. It’s very possible that it is several algorithms working in tandem with each other and that Neural Matching is just the collective name given to all these algorithms working together.
In recent years, Google has updated its algorithms to have a better understanding of long-form content. Does this mean that Google now prefers long-form content over short-form content? It’s unknown at this time.
However, websites and content producers experiencing drops in their search rankings should review the content produced by high-ranking sites to see whether sites that produce long-form content really do rank better than those that provide short-form content.
Google’s algorithms are fine-tuned to understand the meaning of content rather than just the keywords associated with said content. When writing about a topic, it is advised to stick to that particular topic without going on any tangents. When planning and writing content, it pays to be focused.
Google’s Neural Matching is less about what the technology is and more about what it represents. Neural Matching makes it possible to generate search results based on user intent and page content rather than traditional signals like keywords and links.
In the end, it means that if a website wants to rank, its content producers will have to create content that is relevant to users’ search queries.
Nowadays, Google does what’s called a rolling update, i.e., it refreshes its index daily. The big updates, however, can happen several times a year and are usually indicative of the way that Google improves its understanding of queries and content.