
SEO sanity check part 1: Google’s Penguin and Panda updates

By Kerry Butters | How To, Marketing, Resources | May 10, 2013

SEO has always been a tricky business: not only do experts have to spend time researching keywords and following best practices, they also have to be prepared for the changes that search engines inevitably put in place.

Last year saw search giant Google make two major algorithm updates — Panda and Penguin — that sent many a site plummeting down the rankings as it was penalized under the new rules.

This was because the changes were implemented to rank down poor-quality sites, such as content mills and link farms, and give more weight to sites that produce quality content.

This is carried out by changing how Google’s spiders assess a site, giving better rankings to sites with quality, well-written content and social media engagement. For web professionals, this created something of a panic, as static sites that were not particularly well written, and were stuffed with keywords, began to fail.

Penguin and Panda updates relied on a new set of rules and more complex algorithms designed to rank a site on a number of different factors.

These include:

  • Content: Google’s spiders can now tell if a site is badly written, with spelling and grammatical errors, lots of ads and poor quality links. This change is seen as a welcome one by many SEO and digital professionals, as it immediately knocked poor quality article syndicates and content mills down the ranks, so that high-quality sites could take their place and be more useful to searchers.
  • Freshness: the “freshness” of copy has become more important to Google than inbound links. This means that in order to compete on Google, it’s necessary to add new content often. The freshness ranking looks at three key areas: trending topics, such as the Olympics or the US election; recurring events, such as the Superbowl; and how recently the content has been added.
  • Unique content: ever copied and pasted some content into a website to cut corners? Now it will also cut the site’s ranking. Original content is one of the most important aspects of determining position. Content containing unnatural links will also be penalized, so it’s important to ensure that links appear organically and are very relevant to the content. This is only going to become even more important as Google’s Author Rank takes off.
  • Social: as many of you will know, social is the new marketing and is a very powerful tool for SEO. Google now uses social signals in search results to determine just how useful a site is across the board. It’s important now for online marketers and SEO experts to include social, ensuring that all brand colors and logos are uniform across social channels and websites. Additionally, it’s important that the social presence is well managed; a badly run, bot-managed social presence will harm a site’s rankings.
  • Free from technical errors: this in particular is important for web professionals, and will no doubt knock a lot of blogging sites off the top perch. A site with sound architecture will perform better than one built on templates or Flash, or one that is more than two years old. This means that code should be standards-based, with valid CSS and tidy meta data.
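One way to start hunting down technical errors like those above is to script a basic audit. The sketch below uses Python’s built-in html.parser to flag a few common metadata problems; the particular checks, and the MetaAudit/audit names, are my own illustrative assumptions, not Google’s actual criteria.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collects a few basic metadata signals from an HTML page."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_description = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            # an empty description is as unhelpful as a missing one
            self.has_description = bool(attrs.get("content"))
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

def audit(html):
    """Return a list of human-readable problems found in the markup."""
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if not parser.has_title:
        issues.append("missing <title>")
    if not parser.has_description:
        issues.append("missing or empty meta description")
    if parser.images_missing_alt:
        issues.append(f"{parser.images_missing_alt} image(s) without alt text")
    return issues

page = '<html><head><title>Example</title></head><body><img src="a.png"></body></html>'
print(audit(page))
```

A real audit would of course also validate the CSS and check page speed, but even a tiny script like this catches the kind of sloppy meta data the update punishes.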

 

How to address problems with a site’s ranking

Even some of the biggest sites were affected by the changes to Google’s algorithms; I read of one that had to be stripped right back in order to change all of its keywords and duplicate pages.

A site that is poorly written should have all of its content refreshed, preferably by someone who can write. This includes blog posts and articles, so if a site has a lot of content like this, it may be better to strip it all out and add new, well-written content as it becomes available.

Meta data also has to be clean and tidy, and here Google tends to ignore keywords and concentrate on descriptions. Keywords, of course, still have their place, and it’s important to ensure that these are well researched and analyzed, but articles and blogs with a high keyword density are likely to be penalized. This is because keywords, when overused, tend to compromise the quality of the writing.
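Keyword density is simple enough to measure yourself. The short sketch below counts how often a keyword appears as a share of the total word count; the sample copy is invented for illustration, and the point is only the article’s own: copy this repetitive reads badly and risks a penalty.

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 100 * words.count(keyword.lower()) / len(words)

copy = ("Our SEO guide covers SEO basics. SEO tools and SEO tips "
        "help you write naturally instead of repeating SEO.")
print(f"{keyword_density(copy, 'seo'):.1f}%")  # 5 of the 19 words: far too dense
```

There is no official density threshold, but if a single keyword accounts for a quarter of your words, as in this sample, the writing has almost certainly been compromised.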

Penguin concentrated on getting rid of those sites that attempted to “trick” Google’s algorithms with the overuse of keywords and link spamming. If you’ve determined that a site has spam links pointing at it, use Google’s Disavow Tool, which asks Google to disregard those links when assessing the site. However, it’s important to note that a careful site audit should be carried out first to identify the bad links, and that the tool should be used with caution.

For Panda, it’s also worth checking that a site’s content is unique; it has to be 60% unique site-wide, as well as accessible, in order to pass Panda’s rules.
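There’s no official tool that reports that 60% figure, but you can get a rough sense of how similar two pieces of copy are with Python’s difflib. The metric and sample text below are illustrative assumptions on my part, not the measure Google actually uses.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Word-level similarity between two pieces of copy: 0.0 (unique) to 1.0 (identical)."""
    return SequenceMatcher(None, a.split(), b.split()).ratio()

original = "handmade ceramic mugs fired in our own kiln and glazed by hand"
copied   = "handmade ceramic mugs fired in our own kiln and shipped worldwide"
print(round(similarity(original, copied), 2))
```

Running a check like this across page pairs gives a quick, if crude, way to spot near-duplicate copy that needs rewriting before it drags the whole site down.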

Panda concentrated more on the actual content, and both algorithms are still updated regularly in order to refine them. For the most part, Penguin concentrates on keyword stuffing within articles and spam links.

Essentially, they are both concerned with accessibility, content, spamming techniques and new rules that are designed to prevent black hat SEO.

 

What is black hat SEO?

Basically, black hat SEO is a way of manipulating the search engines, essentially ‘tricking’ them into thinking a site is more valuable than it is. Black hat uses aggressive tactics and is geared towards the search engine, rather than a human audience.

Over coming articles, I will take a look at black, white and grey hat techniques in order to give a clear overview of which can be used safely and which are a no-no. The problem many have found is that some less-than-reputable SEO ‘experts’ have employed black hat techniques in order to win more customers and make a quick buck. This is why some business sites have dropped like a stone down the rankings, often unaware that they have done anything wrong.

Black hat techniques include:

  • packing code with ‘hidden’ text;
  • link farms where a group of sites all link to each other to spam the index of a search engine;
  • blog spam, using the comments field on blogs and forums to place links to other sites;
  • scraping, a practice where one site takes content from another in order to appear more valuable to search engines;
  • doorway pages used with the intention of enticing searchers with phrases not related to site content;
  • parasitic hosting, where a site is hosted on someone else’s server without permission;
  • cloaking, a technique in which the search engine spider sees different content to the end user who views through a browser.

Black hat methods are seen by many web professionals to be unethical, as they use tactics that promise swift returns but run the chance of damaging a company’s reputation, website and in turn, profits.

Utilizing black hat methods often means that a site doesn’t have to wait months for link backs, as it would with traditional white hat methods. However, it also fills the internet with useless information and spam, and over the years has seriously affected search.

It’s also cheaper for the SEO strategist to carry out, as often a blog network will already be set up to link from, and it doesn’t depend heavily on analytics and content, as white hat practices do.

Not only can employing black hat methods lead to the threat of legal action; if they are used alongside a PPC campaign, heavy penalties can be incurred from the advertising host.

It’s not recommended that a site use black hat techniques, given the penalties involved in terms of legal action, reputation and the threat of not ranking at all. However, that no doubt won’t stop everyone, despite the Google updates.

That said, we’re already seeing content mills drop rapidly down the rankings, so the updates are obviously working, as this is one of the key areas that Google wanted to address.

Google and all of the major search engines have a vision, one that intends to clean up the web and do away with bad practices, leading to more useful content appearing at the top of search for us all. Whether you use black hat techniques or not is between you and your conscience, but certainly I for one am glad of the ability to search and not come up with a page full of junk before I get to what I want.

 

What problems have you run into as a result of Panda and Penguin? How have you solved black-hat techniques employed by predecessors? Let us know in the comments.

Featured image/thumbnail, search image via Shutterstock.

Comments (no login required)
  • http://www.facebook.com/emosewamai Andrew Hersh

    I wish I could hug this article.

  • http://twitter.com/RaptoraUK Raptora

    Yet another invaluable article :)

  • http://twitter.com/kesbutters Kerry Butters

    This is something I address in an upcoming article CJ but for the time being, do a site audit using open site explorer [OSE], make a list of all the bad links including anchor text etc and add a rel=”nofollow” attribute to the <a> tag. Then it’s a case of letting Google know that you’ve done this, with a list of the bad links attached and a note saying they have been addressed. Not necessarily an easy process!

  • Clueless.com

    “the “freshness” of copy has become more important to Google than inbound links”

    Oh dear, oh dear.

  • http://www.facebook.com/ursula.robinson1 Ursula Robinson

    Great article thank you for sharing

  • http://www.facebook.com/NewYorkSEOservice James Simmons

    humm

  • http://www.facebook.com/the.ale.mello Ale Mello

    Nice post! Thanks!

  • meenu

    there is any panda update coming after penguin update or not ? reply me

    • Robin

      no i think

  • Robin

    Let me also for next update of panda and penguin, the article is good

  • Tosee Mihan

    thanks.

  • sonia

    Nice post to read. Thanks for sharing with us. Keep up the great work i’ll be visiting to your blog

  • http://www.seoupdates.xbriz.com/ Rajeev Kumar

    Very informative article on SEO. Thank you for this wonderful job.

  • Menshealth99

    Excellent article on what are leading black hat SEO techniques.

  • Rubina

    How to know that panda & penguin rules updated?? Is there any fix timeing for that or google mail us that new rules or we want to check every day in google i am little bit confused please tell me.

  • Joe Cabello

    “In order to determine if a site has spam links pointing at it, use Google’s Disavow Tool, which will remove them for you.”
    You might want to correct this. I think you misspoke. DIsavow tool will not determine which links are spam.

    • Benjie

      Quite right, I’ve corrected it. Thanks.

  • Macee Miller

    ive learnt so much in this gem of an article :) thankyou

  • http://dave-lucas.blogspot.com DaveLucasNotes

    Hello There! (And just who wrote this article? Butters or Benjie?)

    Mind you I am not patting myself on the back, but I have cooked up bunches of wonderful unique posts on my blog! You and your readers will never find them, not using Google as a search engine! My blog has ranked everywhere from a PR0 to a PR7 … it loaded slow as molasses and had an ugly template design and color scheme when it had that PR7!

    Nowadays you are penalized if you publish a guest post, while another site or blog gets penalized if you link to them, Google will slap you if you put an infographic on your blog…

    WHAT HAPPENED TO PLAIN OLD BLOGGING? And why are so consumed with how many followers we have and who “likes” our posts?

    I throw my hands up in the air! I am just going to blog as I always have, and speak out if I feel the need to (and of course I will back up all my posts in case they jettison my blog – they’ve done it to me and to others in the past!)

    BLOG ON, PEOPLE!

  • irkitated

    Thanks for the info. I lost about 80% of my daily hits in the past month and now most of my hits are coming from social media. I wish Google would play nice and stop messing with us.

  • Pinakin Darji

    right Jayesh Prajapti

  • http://www.fasttrackcreations.com/blog Nikhil Malhotra

    Very valuable.Correct and too the point.

  • http://www.squarefishinc.com/ SquareFish Inc.

    I shared the same thoughts when I was also hit by the updates. But the good thing is I have recovered and try to start anew. I’m still learning my lessons the hard way, but that would serve as my guide to not dwell into that bad habits ever again.