Google’s Panda Update
January 10, 2012
Got hit by Google’s latest search algorithm release? Lost rankings for pages, sections of a site, or the entire domain? It’s a familiar cry from many site owners and SEO professionals alike. In a search world where Google reigns supreme, we have little recourse but to start being responsible content producers.
There are many authoritative sources that define what the Panda Update is, how it affects ranking in the SERPs, and even one that tests the Panda algorithm in an effort to identify the elements and thresholds at which the Panda penalties take effect.
Tell Me What It Is Already
The short of it is: Google’s new algorithm is supposed to reflect the quality of a page based on human evaluators’ ratings, via machine learning. The argument is that traditional SEO has gotten to a point where everyone and their grandmother knows how to game the system, so there needed to be a better way to gauge and rank. The Panda Update addresses exactly this kind of gaming.
Rand Fishkin talks about how traditional SEO doesn’t cut it anymore, and how SEO efforts will have to go beyond simply checking things off an SEO checklist. So while the pages on your website may in fact be well within Google’s guidelines for good content, that’s not good enough anymore. The Panda Update takes a user’s experience into account. Your rank and position will be determined by this “experience” signal, the experience signals across all the other pages on the domain, and the traditional “SEO” signals combined.
Cut to the Chase
I’d like to think that I’m quite familiar with how human evaluations work, and what the scores are based on. I can tell you with a good amount of confidence that these are critical factors in determining the experience of a page:
- Too Many Ads – No one likes ads. This isn’t news. We understand that ads help support the site, and I think most people can appreciate and accept that. However, there’s a fine line that shouldn’t be crossed, and Panda is drawing that line.
- Duplicate Content – All variations of the same anchor links (including textual ads) riddled throughout the content or layout. When an evaluator sees a page with content they’ve seen before, that’s tagged, too. (An evaluator can go through hundreds, if not thousands, of pages a day, and the evaluations are usually query-based, so the URLs are in the same “neighborhood”.)
- Content Below the Fold – Even if the page doesn’t have many ads and there is no duplicate content, if the real content is pushed down by ads, widgetized content, or other blocks that take up too much space, the page will be tagged. This is considered obfuscation: pushing the content below the fold forces a user to scroll and otherwise hunt for the content, which makes for a bad user experience.
Those are, for the most part, the most important factors a quality evaluator takes into account when rating websites. So according to Google, SEO is no longer just a question of having algorithm-friendly components on your pages; it’s becoming a question of value and experience for the user as well.
How To Recover
Already seeing a drop in traffic and/or clicks? It’s likely because you’ve lost position to a competitor who provides a better experience and therefore offers more value. Address the issues above and you’ll be on track to getting things back to normal. Here are the actions I recommend:
- Don’t kill off the pages – they’ve already been indexed; re-edit the content instead
- Re-evaluate the ads you display – choose the most appropriate (relevant) categories and providers from the ad networks
- Reconsider the placement of those ads and the content – try to push the content up above the fold as much as you can
Note: Regarding the third point, “above the fold” may simply be a matter of content order within the markup. A skilled developer can move the relevant content to the top of the markup while keeping the visual layout intact, as in the sketch below.
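As a rough illustration (the class names, widths, and structure here are hypothetical, not taken from any particular site), the main content can come first in the HTML source while CSS floats the ad sidebar back into its usual visual position:

```html
<!-- Hypothetical layout: the article markup comes first in the source,
     so crawlers and evaluators reach the real content immediately. -->
<div class="page">
  <!-- Main content first in the markup -->
  <div class="content">
    <h1>Article Title</h1>
    <p>The actual article text starts here, above the fold...</p>
  </div>
  <!-- Ad sidebar last in the markup, floated into place with CSS -->
  <div class="sidebar">
    <!-- ad units go here -->
  </div>
</div>

<style>
  /* Hypothetical styles: content floats left, sidebar floats right,
     so the rendered layout is unchanged even though the source order is. */
  .page    { width: 960px; margin: 0 auto; }
  .content { float: left;  width: 700px; }
  .sidebar { float: right; width: 240px; }
</style>
```

Because the article markup now precedes the ad markup, the real content sits higher in the source order even though the page looks exactly the same to a visitor.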