Big “G” Algorithm Updates Cheat Sheet: major Google penalties & algo changes

Here are a few excerpts from an article I ran across; thought some of you might like to have a look.

RankBrain is all over the Web, and yet it’s barely clear what it does and how it works.
Google rolls out algorithm updates once or twice every month (and that’s just the ones we know about!), but not all of them equally impact the SERPs. To help you make sense of Google’s major algo changes over the past few years, I’ve put together a cheat sheet with the most important updates and penalties, how-tos on checking whether you were hit by any given one, recovery advice, and prevention tips.

1. Panda

First launched: Feb 24, 2011
Rollouts: ~monthly
Nature: Penalty
Goal: De-rank sites with low-quality content

Google Panda is an algorithm used to assign a content quality score to webpages and down-rank sites with low-quality, spammy content. Initially, Panda was a filter rather than a part of Google’s core algorithm, but in January 2016, it was officially incorporated into the ranking algo. While this doesn’t mean that Panda is now applied to search results in real time, it might indicate that Panda rollouts will now happen faster and more frequently.
Every new Panda rollout means that sites previously hit may recover if they’ve made the right changes, and sites that escaped before can get caught and penalized.
Panda hazards

Google won’t disclose the exact factors Panda looks at when determining content quality. But based on what Google has said on the topic and the findings from SEOs around the Web, the following on-page factors can act as Panda triggers:

  • Duplicate content
  • Plagiarism
  • Thin content
  • User-generated spam
  • Keyword stuffing
  • Poor user experience

If you do locate a significant traffic drop that corresponds to a certain Panda rollout, chances are your site has indeed been penalized by Panda. The good news is, now that Panda is part of Google’s core algo, recovering should take less time than it used to; if you manage to fix the problems before the next Panda iteration, you will likely regain your rankings within a few weeks.
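Just to illustrate the idea (this isn’t tied to any particular analytics tool): if you can export daily organic sessions, a few lines of Python can compare the averages before and after a known rollout date. The 14-day window and the drop threshold you pick are judgment calls, not anything Google publishes.

```python
def traffic_drop(sessions, rollout, window=14):
    """Compare average daily sessions in the `window` days before vs. after
    a rollout date. `sessions` maps datetime.date -> organic session count.
    Returns the relative change (e.g. -0.4 = a 40% drop), or None if there
    isn't enough data on either side of the date."""
    before = [v for d, v in sessions.items() if 0 < (rollout - d).days <= window]
    after = [v for d, v in sessions.items() if 0 <= (d - rollout).days < window]
    if not before or not after:
        return None
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return (avg_after - avg_before) / avg_before
```

A sustained drop of more than 20–30% that lines up with a documented rollout date is worth investigating; smaller wiggles are usually just noise.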

Run a Panda audit

To identify and fix Panda vulnerabilities on your site, follow the steps below.
1. Check for duplicate content across your site. Duplicate content on your own website is one of the most common Panda triggers. If you have a big site (>1000 pages), it’s recommended that you run regular content audits to make sure there are no duplication issues. For smaller sites, an occasional audit when you’ve added a bunch of new pages should be enough.
Whenever you’re adding new content to your site, make sure it’s available via one URL only, and does not replicate any of your existing pages.
If for some reason you can’t take down the duplicate pages, use a 301 redirect or a canonical tag; alternatively, you can keep the pages out of the index with the noindex meta tag. (Note that blocking a page in robots.txt stops Google from crawling it, but won’t necessarily remove it from the index — and it also prevents Google from seeing a noindex tag on that page.)
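For reference, here’s what the canonical and noindex options look like; both go in the duplicate page’s <head>, and the URL is a placeholder:

```html
<!-- Point Google at the preferred version of the page -->
<link rel="canonical" href="https://www.example.com/preferred-page/">

<!-- Or keep the duplicate out of the index entirely -->
<meta name="robots" content="noindex">
```

Use one or the other per page: a canonical consolidates ranking signals into the preferred URL, while noindex simply removes the page from search results.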
2. Check for plagiarism. External duplication is another Panda trigger. If you suspect that some of your pages may be duplicated externally on other online resources, it’s a good idea to check them with Copyscape. Copyscape gives some of its data for free (for instance, comparing two specific URLs), but for a comprehensive check you may need a paid account.
In some industries (think online stores with thousands of product pages), 100% unique content isn’t always realistic. If you run an e-commerce site, try to use original images where you can, and utilize user reviews to make your product descriptions stand out from the crowd.
3. Identify thin content. Thin content is a bit of a vague term, but it’s generally used to describe an inadequate amount of unique content on a page. Often, thin content pages are filled with ads, affiliate links, etc., and provide little original value.
If you feel thin content could be a problem on your site, it’s a good idea to measure it in terms of word count and the number of total links on the page.

Keep in mind that a "desirable" word count on any page is tied to the purpose of the page and the keywords that page is targeting. For queries that imply the searcher is looking for quick information ("what’s the capital of Nigeria", "gas stations in Las Vegas"), pages with a hundred words of content can do exceptionally well on Google. The same goes for searchers looking for videos or pictures. But if those are not the queries you’re targeting, too many thin content pages (<250 words) will very likely get you into trouble.
As for outgoing links, Google recommends keeping the total number of links on every page under 100 as a rule of thumb. So if you spot a page with under 250 words of content and over 100 links, that’s a pretty solid indicator of a thin content page.
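If you want to automate that check, here’s a rough sketch using Python’s standard-library HTML parser. The 250-word and 100-link thresholds come straight from the rule of thumb above; the word counting is deliberately naive (it skips script/style content but doesn’t try to isolate the main content area).

```python
from html.parser import HTMLParser
import re

class PageStats(HTMLParser):
    """Rough word and link counts for a thin-content check."""
    def __init__(self):
        super().__init__()
        self.links = 0
        self.words = 0
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(k == "href" for k, _ in attrs):
            self.links += 1
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(re.findall(r"\w+", data))

def looks_thin(html, min_words=250, max_links=100):
    """Flag a page with little text and a lot of links."""
    p = PageStats()
    p.feed(html)
    return p.words < min_words and p.links > max_links
```

Run it over your crawled pages and manually review whatever it flags — a short page isn’t automatically a bad page, as the word-count caveat above explains.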
4. Audit your site for keyword stuffing. Keyword stuffing is a term used to describe over-optimization of a given page element for a keyword. To figure out if there are keyword stuffing issues on your pages, it’s a good idea to look at your top ranking competitors’ pages.
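There’s no official keyword density threshold (Google has never published one), but a quick script can at least flag pages where one phrase dominates the copy relative to your top-ranking competitors. This is a naive sketch — exact-word matching only, no stemming or synonyms:

```python
import re

def keyword_density(text, keyword):
    """Share of the page's words taken up by occurrences of `keyword`.
    Naive: matches the exact word sequence, case-insensitively."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return hits * len(kw) / len(words)
```

Compare your page’s figure against the same metric on competitors’ pages rather than against any absolute number.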

5. Fix the problems you find. Once you’ve identified the Panda-prone vulnerabilities, try to fix them as soon as you can to prevent being hit by the next Panda iteration, or recover quickly if you’ve already been penalized.

2. Penguin

First launched: April 24, 2012
Rollouts: May 25, 2012; Oct 5, 2012; May 22, 2013; Oct 4, 2013; Oct 17, 2014
Nature: Penalty
Goal: De-rank sites with spammy, manipulative link profiles

Google Penguin aims to identify and down-rank sites with unnatural link profiles, deemed to be spamming the search results by using manipulative link tactics.
When a new Penguin update is released, sites that have taken action to remove the harmful links (such as through the Google Disavow tool) can regain rankings. New sites, and sites not previously caught, can in turn get trapped by Penguin.
Presently, Google’s looking to incorporate Penguin into their core ranking algo, and make it a real-time algorithm, which means that penalties will be applied faster, but recovery will also take less time.
Penguin hazards

Penguin is looking for sites with one or several of the following types of links in their profile:

  • Links coming from poor quality, "spammy" sites
  • Links coming from sites created purely for SEO link building (PBNs)
  • Links coming from topically irrelevant sites
  • Paid links
  • Links with overly optimized anchor text

Run a Penguin audit

Penguin uses a bunch of factors to identify spammy link profiles. Based on what Google has said on the issue and post-Penguin feedback from webmasters, SEOs now have a pretty solid idea of what these factors are.
1. Audit your backlink profile. Review the links pointing to your site and flag any that match the risk factors above: links from spammy or topically irrelevant sites, PBN links, paid links, and links with overly optimized anchor text.
2. Get rid of harmful links. Ideally, you should try to request removal of the spammy links in your profile by contacting the webmasters of the linking sites. But if you have a lot of harmful links to get rid of, or if you don’t hear back from the webmasters, it’s a good idea to disavow the links using Google’s Disavow tool, telling Google to ignore those links when evaluating your link profile. Disavow files can be tricky in terms of syntax and encoding, but SEO SpyGlass can automatically generate them for you in the right format.
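If you’d rather build the disavow file yourself, the format is plain text: one full URL per line to disavow links from a single page, a "domain:" line to disavow every link from a site, and "#" for comments. A tiny generator (the domains and URLs in the test are placeholders):

```python
def build_disavow(urls=(), domains=()):
    """Build the text of a Google disavow file.
    Full URLs disavow links from a single page; 'domain:' lines
    disavow every link coming from that domain."""
    lines = ["# Disavow file for Google Search Console"]
    lines += ["domain:%s" % d for d in domains]
    lines += list(urls)
    return "\n".join(lines) + "\n"
```

Save the result as a plain-text .txt file (UTF-8 or 7-bit ASCII) before uploading it to the Disavow tool.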

3. Hummingbird

First launched: August 22, 2013
Nature: Ranking algorithm change
Goal: Produce more relevant search results by better understanding the meaning behind queries

Google Hummingbird is a major algorithm change that has to do with interpreting search queries (particularly longer, conversational searches) and providing search results that match searcher intent, rather than individual keywords within the query.
While keywords within the query continue to be important, Hummingbird gives more weight to the meaning behind the query as a whole. The use of keyword synonyms has also been optimized with Hummingbird; instead of listing results with the exact keyword match, Google shows more theme-related results in the SERPs that do not necessarily have the keywords from the query in their content.
Hummingbird hazards

Hummingbird gives a ranking advantage to pages that provide valuable, original content, putting an end to old-school on-page tactics that bring no value to the user. The following features can get pages down-ranked in SERPs due to Hummingbird.

  • Exact-match keyword targeting
  • Keyword stuffing
  • Poor user experience

Check if you were hit

Like with Panda and Penguin, you can check whether there were drops in your organic traffic after Hummingbird was released with Rank Tracker (the free version is fine).

Adapt to Hummingbird

Hummingbird puts traditional exact-match keyword targeting in the past. Here are the steps to help you adapt your on-page strategy to Hummingbird.
1. Expand your keyword research. With Hummingbird, it’s a good idea to focus on related searches, synonyms and co-occurring terms to diversify your content, instead of relying solely on short-tail terms you’d get from Google AdWords. Great sources of Hummingbird-friendly keyword ideas are Google Related searches, Google Autocomplete, and Google Trends.

2. Discover the language your audience uses. It’s only logical that your website’s copy should be speaking the same language as your audience, and Hummingbird is yet another reason to step up your linguistic game. A great way to do this is with a Web listening tool: explore mentions of your keywords (your brand name, competitors, industry terms, etc.) and see how your audience talks about those things across social media and the Web at large.
3. Ditch exact-match, think concepts. Unnatural phrasing, especially in titles and meta descriptions, is still popular among websites, but with search engines’ growing ability to process natural language, it can become a problem. If you are still using robot-like language on your pages for whatever reason, now (actually, two years ago) is the time to stop.
Including keywords in your title and description still matters; but it’s just as important that you sound like a human. As a nice side effect, improving your title and meta description is sure to increase the clicks your Google listing gets.
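Pixel width, not character count, is what actually determines truncation in the SERPs, but character-based rules of thumb (roughly 60 characters for titles, 155 for meta descriptions) are close enough for a quick audit. The limits below are those rules of thumb, not official Google numbers:

```python
def snippet_warnings(title, description, title_max=60, desc_max=155):
    """Flag titles/descriptions likely to be truncated in the SERPs.
    Limits are common rules of thumb, not published Google values."""
    warnings = []
    if len(title) > title_max:
        warnings.append("title is %d chars (aim for <= %d)" % (len(title), title_max))
    if len(description) > desc_max:
        warnings.append("description is %d chars (aim for <= %d)" % (len(description), desc_max))
    return warnings
```

Anything flagged is worth rewriting anyway — truncated snippets tend to get fewer clicks.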

4. Pigeon

First launched: July 24, 2014 (US)
Rollouts: December 22, 2014 (UK, Canada, Australia)
Nature: Ranking algorithm change for local search
Goal: Provide high quality, relevant local search results at the top of SERPs

Google Pigeon (currently affecting searches in English only) dramatically altered the results Google returns for queries in which the searcher’s location plays a part. According to Google, Pigeon created closer ties between the local algorithm and core algorithm, meaning that the same SEO factors are now being used to rank local and non-local Google results. This update also uses location and distance as a key factor in ranking the results.
Pigeon led to a significant (at least 50%) decline in the number of queries local packs are returned for, gave a ranking boost to local directory sites, and connected Google Web search and Google Map search in a more cohesive way.
Pigeon hazards

The following factors may become ranking disadvantages for local businesses after Pigeon:

  • Poorly optimized pages
  • Lack of quality backlinks
  • Improper setup of a Google My Business page
  • NAP inconsistency
  • Lack of a citation in local directories (if relevant)

Adapting to Pigeon

First and foremost, it’s important to understand that Pigeon only affects local searches, i.e. the queries for which Google displays different results depending on the searcher’s location. If you do local SEO for a business in the US, UK, Canada, or Australia, follow the steps below to ensure you meet Google’s local search guidelines.
1. Optimize your page properly. Pigeon brought in the same SEO criteria for local listings as for all other Google search results. That means you need to focus on on-page optimization and link building just as you would for any other site, whether you’re doing local or international SEO.
2. Set up a Google My Business page. Creating a Google My Business page for your local biz is the first step to being included in Google’s local index. Your second step will be to verify your ownership of the listing; typically, this involves receiving a letter from Google with a PIN which you enter to complete verification.
As you set up the page, make sure you categorize your business correctly — otherwise, your listing will not be displayed for relevant queries. Remember to use your local area code in the phone number; the area code should match the code traditionally associated with your location. The number of positive reviews can also have an influence on local search rankings, so it’s a good idea to encourage happy customers to review your biz.
3. Make sure your NAP is consistent across your local listings. Google will look at the website you’ve linked to from your Google My Business page and cross-reference the name, address, and phone number of your business. If all elements match, you’re good to go.
If your business is also featured in local directories of any kind, make sure the business name, address, and phone number are also consistent across these listings. Different addresses listed for your business on Yelp and TripAdvisor, for instance, can seriously hurt your local rankings.
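Cross-checking NAP data by hand gets tedious once you’re listed in a dozen directories. Here’s a minimal sketch of the normalization idea: lowercase and collapse whitespace in names and addresses, strip phone numbers down to digits, then compare. (A real check would also need to handle abbreviations like "St." vs. "Street".)

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone triple for comparison across listings."""
    def norm(s):
        return re.sub(r"\s+", " ", s.strip().lower())
    digits = re.sub(r"\D", "", phone)  # keep digits only
    return (norm(name), norm(address), digits)

def nap_consistent(listings):
    """True if every (name, address, phone) triple normalizes identically."""
    normalized = {normalize_nap(*listing) for listing in listings}
    return len(normalized) == 1
```

Feed it the triples you’ve collected from each directory; any mismatch tells you which listing to go fix.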
4. Get featured in relevant local directories. Local directories, like Yelp, TripAdvisor and the like, have seen a major ranking boost after Pigeon. So while it may be harder for your site to rank within the top results now, it’s a good idea to make sure you are featured in the business directories that will likely rank high.

5. Mobile Friendly Update

First launched: April 21, 2015
Nature: Ranking algorithm change for mobile search
Goal: Display mobile-friendly pages at the top of mobile SERPs

Google’s Mobile Update (aka Mobilegeddon) is meant to give a ranking boost to pages optimized for mobile devices in mobile search, and subsequently, down-rank pages that are not mobile friendly. Desktop searches have not been affected by the update.
Mobile friendliness is a page-level factor, meaning that one page of your site can be deemed mobile friendly and up-ranked, while the rest might fail the test.
Mobile-friendly hazards

When evaluating a page’s mobile friendliness, Google looks at the following factors:

  • Overall mobile friendliness
  • Viewport configuration
  • Illegible content
  • Plugin use

Check if you were hit

Rank Tracker will help you see if your site got de-ranked after the Mobile update.

Adapting to Mobilegeddon

The only way to adapt to Google’s Mobile update is, quite logically, to take your site mobile and ensure that it meets Google’s guidelines for mobile sites.
1. Go mobile already. C’mon, it’s been a year since Mobilegeddon. There are a few mobile website configurations to choose from, but Google’s recommendation is responsive design. Google also has specific mobile how-tos for various website platforms to make going mobile easier for webmasters.
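If you go the responsive route, the one non-negotiable bit of markup is the viewport meta tag, which tells mobile browsers to fit the page to the device’s screen width instead of rendering a zoomed-out desktop layout:

```html
<!-- In the <head> of every page -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Pair it with CSS media queries so the layout itself actually adapts to narrow screens — the tag alone just sets the rendering width.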
2. Take a mobile-friendly test. Going mobile isn’t all it takes – you must also pass Google’s mobile-friendliness criteria to get up-ranked in mobile SERPs.

6. RankBrain

First launched: October 26, 2015 (possibly earlier)
Nature: Ranking algorithm change
Goal: Deliver better search results based on relevance & machine learning

RankBrain is a machine learning system that helps Google better decipher the meaning behind queries, and serve best-matching search results in response to those queries.
While there is a query processing component in RankBrain, there also is a ranking component to it (when RankBrain was first announced, Google called it the third most important ranking factor). Presumably, RankBrain can somehow summarize what a page is about, evaluate the relevancy of search results, and teach itself to get even better at it with time.
The common understanding is that RankBrain, in part, relies on the traditional SEO factors (links, on-page optimization, etc.), but also looks at other factors that are query-specific. Then, it identifies the relevance features on the pages in the index, and arranges the results respectively in SERPs.
RankBrain hazards

Though SEOs still know little about how RankBrain works, one thing we do know is that it looks for relevance features across webpages and somehow evaluates if those features are indicative of the page’s ability to match searcher intent. So in general, the following factors might shift your site down in SERPs after RankBrain:

  • Lack of query-specific relevance features
  • Poor user experience

Adapting to RankBrain

Though we’re still unsure which factors RankBrain takes into account when assessing webpages, one thing we do know is that RankBrain is helping Google provide a better search experience to users (Google’s mentioned that their metrics have improved significantly after RankBrain). Of course, we can only guess what the ‘metrics’ are. But think about it: how can a search engine evaluate user satisfaction? Which factors can a machine learning system use as indicators that it’s doing things right, and as data points for further learning? The logical answer is, by looking at user experience factors like SERP click-through rates, time on page, bounce rates, and pogo-sticking.
1. Maximize user experience. Of course, RankBrain isn’t the reason to serve your visitors better. But it’s a reason why not optimizing for user experience can get you down-ranked in SERPs.

2. Do competition research. One of the things RankBrain is believed to do is identify query-specific relevance features of webpages, and use those features as signals for ranking the pages in SERPs. Such features can be literally anything on the page that can have a positive effect on user experience. To give you an example, Searchmetrics’ research has shown that for ecommerce and health, pages with more content and more interactive elements are more successful.
While there is no universal list of such features, you can get a good idea of what they may be by analyzing the common traits of your top ranking competitors.
So those are the major Google updates to date, along with some quick auditing and recovery tips to help your site stay afloat (and, with any luck, keep growing) in Google search.

Source: Masha Maksimava

Marketing Manager/Copywriter at Link-Assistant