Panda and Penguin: Google ranking algorithm updates

It’s the big news in SEO – recent changes to Google’s search ranking algorithm, known as Panda and Penguin.

Over a three-week period in April 2012, these major updates or ‘filters’ to the algorithm have changed how certain links and sites are perceived, and have caused some sites to slip or even completely lose their positions in Google search engine results pages (SERPs). So what are they in simple terms?


Google Panda

Panda is thought to be a change to how the algorithm assesses on-page factors and usability – so that’s things like:

• quality
• depth and duplication of content
• prevalence of adverts
• ‘overoptimisation’ (trying too hard to optimise for search engines and forgetting users)

Sites that scrape or copy content from elsewhere, produce ‘thin’ or low-value content on a grand scale, rely heavily on templated or database-generated content, or are plastered with AdSense adverts are the types of site that have been ‘hit’ by the update.



Google Penguin

Penguin has been considered the equivalent filter to Panda, but targeting offsite factors – so that’s things like:

• links and low link quality
• link networks
• social and comment link spam
• footer and sidebar (not content) links
• sitewide links
• overall site link profiles
• anchor text

Although there’s no full agreement on what the change actually means (the only way we can know is by looking at sites that have been hit or devalued), the consensus is that Penguin has worked alongside the Panda update to really hit the value of links from spammy, low-value sites, and has reduced the value of ‘unnatural’ links such as footer and sitewide links, i.e. links that are not relevant or in context.


It also seems to look at overall site link profiles, with sites that have a prevalence of low-value links (low PageRank or Domain Authority) and few higher-quality ones also being affected.

It’s been proposed that anchor text (the words used in a link) has been devalued or has become a significantly less important part of the algorithm, with Google perhaps now so much better at working out what a site is about that it doesn’t need to rely on the anchor text to spell it out.

It’s also highly likely that sites with unnatural anchor text profiles, i.e. unvaried, mostly keyword-rich (overoptimised) anchor text and few brand anchors, have been negatively affected.
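To make that idea a bit more concrete, here’s a minimal sketch of how you might eyeball your own anchor text distribution from a backlink export. The anchor list, the ‘brand or generic’ set and the idea of a keyword-rich share are all illustrative assumptions, not anything Google publishes:

```python
from collections import Counter

# Hypothetical anchor texts, e.g. exported from a backlink tool.
anchors = [
    "cheap blue widgets", "cheap blue widgets", "buy blue widgets online",
    "Example Widgets", "example.com", "click here", "cheap blue widgets",
]

# Anchors we treat as 'brand' or generic (non-keyword) for this example.
brand_or_generic = {"example widgets", "example.com", "click here", "here", "website"}

counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())

# Share of links whose anchor is a keyword phrase rather than a brand/generic term.
keyword_share = sum(n for a, n in counts.items() if a not in brand_or_generic) / total

print(f"Distinct anchor texts: {len(counts)}")
print(f"Keyword-rich anchor share: {keyword_share:.0%}")
# A very high keyword share with little variation is the kind of 'unnatural'
# profile described above -- there is no official Google threshold.
```

Run on a real export, a profile that is almost entirely exact-match keyword anchors would be the sort of pattern the post is talking about.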


read more

If you want a bit more insight, I’d recommend the excellent Penguin and Panda discussion collected by CognitiveSEO, involving a load of top SEO pros.


what’s it all for?

The overall aim of these updates is to reduce web and link ‘spam’, so that low-quality links and low-value sites are devalued or even removed from the algorithm’s consideration, and therefore from search results. Has it been a total success? No, there are many instances of sites being penalised that perhaps shouldn’t have been, and of sites dropping because of ‘unfortunate’ rather than spammy practices.

However, with consistent iterations and updates it’s likely to really reduce the effect of spammy and low-quality SEO techniques – which can only be a good thing.


Images from the ever-excellent NatalieDee.com
