Who has not dreamed of sneaking into Google's offices one day and uncovering the secrets behind all of their algorithms?
These algorithms decide whether your website or blog posts rank first or last in the search results. They can make your business prosper, or sink it if its SEO is penalized.
Of course, no one can reveal Google's entire algorithm. That secret is too well kept…
However, some parts of Google's algorithm are regularly updated. Knowing these algorithms will give you a global view of SEO, help you analyze the impact of some of these updates, and perhaps let you understand a little better what your webmaster or SEO expert really means when they talk about penguins and pandas.
In addition, if you have noticed a sudden change in your website's traffic or rankings, there is a good chance that an update to one of Google's algorithms has had a positive or negative impact on your website.
Use this list of Google algorithms and updates, sorted by deployment date, to understand the logic behind SEO:
Google Algorithms and Updates
This might surprise many, but Google’s ranking is not based on a single algorithm. In fact, several algorithms work in parallel.
All of these algorithms are used to improve the quality of search results and provide the best possible experience for users. Indeed, as behavior on the internet evolves, the algorithms adapt: new ones are created or existing ones modified according to users' needs and behavior.
The names of the algorithms are not always easy to remember and one often wonders how some of them were chosen. Besides this mystery, this list will allow you to have a clear overview of the main algorithms of Google and their updates since 2010:
1 – Caffeine
Deployed in June 2010, Caffeine is a redesign of Google's indexing system. The Caffeine algorithm allows Google to crawl and then index a page almost instantly. Before its implementation, Google could only index pages in large batches, after extracting, analyzing, and understanding their content.
With this new system, which makes indexing quick and continuous, Google is able to provide results that are 50% fresher than before the algorithm was implemented.
2 – Panda
The Panda algorithm is a search filter. Introduced for the first time in February 2011, it penalizes the rankings of websites whose content is of low quality. Panda essentially aims to fight content farms, sites created solely for SEO, and spam.
The Panda algorithm is updated regularly, allowing previously penalized sites to recover their rankings after improving the quality of their content, or, on the contrary, penalizing sites that no longer comply with Google's guidelines.
In its first deployment, Panda had a major impact on search results, affecting nearly 12% of queries in the United States!
After update 4.2, content quality became an SEO factor, and Panda's integration into Google's core algorithm was confirmed in January 2016. Google no longer announces Panda updates: the algorithm is now applied continuously when ranking websites in the search results pages.
3 – Top Heavy (Excess Advertising)
The Top Heavy algorithm was deployed in January 2012 to penalize the rankings of sites that are abnormally overloaded with advertisements, especially above the fold.
However, this minor update affected only about 1% of search results.
4 – Penguin
The Penguin algorithm is the bane of webmasters: each update was hotly debated on the web and caused waves of panic and soul-searching on social networks. Experts can confirm that with each fluctuation of the SERPs, the SEO community panicked, fearing a new update of the algorithm.
Like Panda, this algorithm is also a search filter, first introduced in April 2012. It penalizes the rankings of websites that do not comply with Google's guidelines on links, whether they are artificially created, purchased, or exchanged through link networks.
Webmasters penalized by Penguin had to clean up their link profile by disavowing the contentious links. If this cleanup was done correctly, they could hope to recover their original rankings at the next update. However, this tedious cleanup is not so simple: months or even years are sometimes necessary before a site can hope to escape Google's SEO penalties.
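For reference, the disavow file submitted through Google's Disavow Links tool is a plain UTF-8 text file with one URL or one "domain:" entry per line, and "#" marking comment lines. Here is a minimal sketch in Python that assembles such a file; the spammy URLs and domains shown are hypothetical examples, and the helper name `build_disavow_file` is my own.

```python
# Sketch: assembling a disavow file for Google's Disavow Links tool.
# Format: plain UTF-8 text, one URL or "domain:" entry per line,
# lines starting with "#" are comments.

def build_disavow_file(urls, domains):
    """Return the text of a disavow file for the given URLs and domains."""
    lines = ["# Disavow file generated after a link audit"]
    for domain in domains:
        # A "domain:" entry disavows every link coming from that host
        lines.append(f"domain:{domain}")
    for url in urls:
        # A bare URL disavows a single linking page
        lines.append(url)
    return "\n".join(lines) + "\n"

print(build_disavow_file(
    urls=["http://spammy-directory.example/page-42.html"],
    domains=["link-farm.example"],
))
```

Keeping the audit results in a script like this makes it easy to regenerate and re-upload the file as the link profile evolves.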
On September 23, 2016, with the launch of update 4.0, Google announced that this update would be the last. Like Panda, the Penguin algorithm has been added to Google's core algorithm and now works in real time.
Monitoring your backlinks must now be constant work, to maintain a healthy link profile that does not risk penalizing the rankings of certain pages.
However, this addition to Google's core algorithm is good news, because webmasters no longer have to wait for a new update to recover their rankings. Indeed, almost two years elapsed between the penultimate update of the algorithm and the deployment of Penguin 4.0.
5 – Pirate
The Pirate algorithm is a search filter deployed in August 2012. It removes from the search results the pages of sites that have received copyright-infringement complaints filed through Google's DMCA system.
This filter is updated regularly to remove pages that offer illegal downloading of movies, series or music.
6 – Exact Match Domain (EMD)
The Exact Match Domain algorithm was deployed in September 2012. It prevents low-quality sites from ranking at the top of the search results simply because their domain name exactly matches a query frequently searched by Internet users.
Indeed, the domain name has a strong influence on SEO, and some webmasters had found a workaround to improve their rankings by creating excessively optimized domain names.
For example, before this algorithm was implemented, a site using "www.software-marketing-pas-cher.com" as its domain name had a good chance of seeing its homepage appear among the first search results for the query "Cheap marketing software", even if the content of its pages did not necessarily meet the needs of Internet users. The deployment of this algorithm has made it possible to avoid such situations.
7 – Payday
This algorithm was deployed in June 2013. It aims to improve the quality of the search pages by filtering the results for queries heavily associated with spam (online gambling, adult content, payday loans, counterfeiting…).
8 – Hummingbird
Hummingbird was deployed in September 2013. This algorithm is one of Google's most important: it has had a strong impact on the way we formulate our searches. Google chose the name Hummingbird because, thanks to it, search became precise and fast.
With this algorithm, Google can understand a query or phrase as a whole, rather than relying on just one or a few keywords. The results offered are therefore of much better quality, and search became more natural thanks to the understanding of conversational queries.
Since the implementation of this new algorithm, it has been possible to obtain precise answers for queries such as "Which is the nearest bakery?" or "Which doctor is on duty today?". This type of search was unthinkable before… Did Hummingbird open the door to artificial intelligence and voice assistants such as Alexa or Siri? Perhaps.
9 – Pigeon
The Pigeon algorithm was deployed in July 2014, in the United States and in June 2015 internationally. This algorithm favors local search results to provide more accurate solutions to user queries. The changes made by this algorithm are visible on Google and Google Maps.
The Pigeon algorithm has had an impact above all on local businesses such as restaurants, bars, or doctors' offices.
10 – Mobile Friendly
On April 21, 2015, Google deployed its Mobile-friendly algorithm, which favors mobile-friendly websites in its rankings.
This algorithm had an even greater impact than Penguin or Panda, and some SEO experts even nicknamed it "mobilegeddon": the Armageddon of mobile compatibility.
This algorithm was deployed in real time and page by page: a site could therefore maintain good overall rankings even if some of its pages were not adapted to the mobile format.
Since 2015, mobile compatibility has been a priority for Google and a very important SEO factor. In fact, in November 2016, Google announced that it would launch its Mobile-first index during 2017.
What is the Mobile-first index? Until now, Google has ranked websites according to their desktop version. But user behavior is changing, and people now spend more time browsing the internet on mobile than on a computer. Google has therefore decided to evaluate the mobile version of a website, rather than the desktop version, when determining its rankings.
If your website is not mobile-friendly, it is time to make some changes. Check the mobile compatibility of your website by running a free test on Website Grader.
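One simple signal you can check yourself is the viewport meta tag, which mobile-friendly pages declare so that browsers scale content to the screen. The following sketch uses only Python's standard library to look for that tag in a hypothetical HTML snippet; note that a real audit tool such as Google's Mobile-Friendly Test checks many more signals than this.

```python
# Sketch: detecting the viewport meta tag, one common signal of a
# mobile-friendly page. The HTML below is a hypothetical example.
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    """Records whether a <meta name="viewport"> tag appears in the page."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True

html = (
    '<html><head>'
    '<meta name="viewport" content="width=device-width, initial-scale=1">'
    '</head><body></body></html>'
)
finder = ViewportFinder()
finder.feed(html)
print("mobile viewport tag found:", finder.has_viewport)
```

A page missing this tag will usually render at desktop width on phones, which is one of the things "mobilegeddon" penalized.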
11 – RankBrain
RankBrain, launched in early 2015, is actually part of the Hummingbird search algorithm.
RankBrain is rather unusual and mysterious, because it is said to be an artificial intelligence capable of understanding the meaning of similar queries that are formulated differently.
For example, through its learning, this artificial intelligence could understand that the queries "Barack" and "Michelle Obama's husband" should yield a similar answer, namely "Barack Obama".
As an extension of Hummingbird, RankBrain aims to interpret and understand the most abstract searches of Internet users. More importantly, Google has stated that RankBrain is one of the three most important ranking factors (along with content quality and links).
RankBrain's learning is applied to all searches, but it takes place offline: Google feeds it historical search data so that it learns to make predictions. These predictions are then tested and applied if they prove correct.
You did not think a simple Google search involved all that work up front, did you?
12 – Quality or Phantom
In May 2015, the SEO world panicked because many webmasters noticed significant changes in the SERPs. However, when members of the Google team in charge of search quality were questioned on Twitter (as is very often the case), they replied that they had no update to announce.
Webmasters, convinced that something was happening, decided to name this update Phantom, because of Google's lack of response despite obvious signs of change.
A few weeks later, Google confirmed that an update had indeed been deployed and that it concerned the quality of website content. The Phantom update was renamed Quality by Google. However, Google never explained how this update differed from the Panda algorithm.
Periodically, updates are noticed by SEO experts but denied by Google. There are therefore several versions of the Phantom algorithm, named, for lack of a better option, Phantom 1, 2, and 3. However, their importance, mechanism, and scope remain more or less unknown.
13 – Google doubles the length of descriptions
In November 2017, Google doubled the number of characters displayed in result descriptions, from a limit of about 160 characters to a limit of about 320 characters.
With this update, Google continues to promote complete sentences and descriptions that contain enough information to give readers context. Note that the search engine may ignore your meta-description tag and truncate or extend some descriptions.
Reminder: meta descriptions do not count toward search rankings, but they remain essential for encouraging visitors to click through to your site.
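As a quick illustration of those limits, here is a small Python sketch that flags descriptions likely to be truncated. The 160- and 320-character figures come from the update described above; the helper name `check_description` and the sample text are my own.

```python
# Sketch: flagging meta descriptions that exceed the display limits
# discussed above (~160 characters before November 2017, ~320 after).

OLD_LIMIT = 160  # approximate pre-November-2017 limit
NEW_LIMIT = 320  # approximate post-update limit

def check_description(description, limit=NEW_LIMIT):
    """Return a short report on a meta description's length."""
    length = len(description)
    if length > limit:
        return f"{length} chars: too long, may be truncated after {limit}"
    return f"{length} chars: fits within the {limit}-char limit"

print(check_description("A short description of a blog post about SEO."))
print(check_description("x" * 400))  # an over-long description
```

Running a check like this over a sitemap is an easy way to spot pages whose snippets Google is likely to cut off.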
You now know a little better the elements that can influence your searches and your website's rankings. However, this list is not exhaustive: I could also mention algorithms like Big Daddy, Florida, or Bourbon, but it seems pointless to go back before 2010, because the internet has changed so much since then!