
Google Tells Us Why “Unnatural Link Warnings” Ask About the SEO Services Used

A variety of smart SEOs have been debating why Google’s unnatural link warnings prompt webmasters for details on the SEO services they used. Of particular note is this excellent case study on Youmoz. 

At stake in the debate: a lot of saved time, plus a fairly easy, predictable, and effective way to buy rankings and income by paying blog networks to manufacture backlinks for you.

[Image: poker chips. Caption: Thinking critically is at the core of SEO and solves high-stakes debates.]

Obviously this competes with AdWords, because you’re buying the same traffic, just from different parties. And it makes Google look stupid. So it’s no surprise it’s against the guidelines.

(It can’t be about Google protecting users from user-unfriendly websites in the rankings because if the websites that ranked didn’t convert – the ultimate measure of user satisfaction – no one would be buying these links.)

The debate revolves around how Google uses the data.

Side 1: Google Gets the Data to Whack-a-Mole the Blog Networks’ Properties

At Warrior Forum in particular, but elsewhere too, webmasters claim that Google can’t be whacking these sites algorithmically; it must be acting on webmasters’ reports. Their reasoning:

1) If Google were penalizing sites algorithmically, why bother asking for the data from penalized webmasters? 

2) An algorithmic approach, they argue, would make negative SEO quite easy. (Negative SEO means violating Google’s guidelines on behalf of a competitor to get them penalized.)

3) If the penalty were algorithmic, why haven’t 100% of the networks’ sites been hit? The conclusion: Google asks for the data to whack as large a percentage of the networks as possible, casting FUD (fear, uncertainty, and doubt) against a competing business model.

Side 2: Google Did It Algorithmically and Blog Networks Are Dead

This side claims that Google may have built a seed list of sites by using this technique, but once it identified the networks’ footprint, the rest was done algorithmically.

This supposedly doesn’t enable negative SEO because… well, people don’t say why. This Warrior Forum post suggests it is possible.

From my experience with reinclusion requests, and from hearing about penalties applied at the keyword level, it appears that Google has minimized the possible impact of negative SEO and gives a site that comes clean its rankings back without much delay.

The result: the effort of dislodging a competitor is expensive, short-lived, or even restricted to a few keywords, so the ROI (compared to building or buying your own links) is dubious.

It’s quite clever: Google has used lateral thinking to carry over a rule from the world of web and software security. There, experts agree that you can’t 100% prevent getting hacked, but you can create measures to minimize the damage.

So why haven’t 100% of blog network sites been hit, if it was algorithmic?

Ah – that goes to the heart of the question: Why are Google’s spam hunters asking for data on the SEO services used by penalized webmasters?

Matt Cutts provided the answer to why Google seeks human spam-report data this past December, in the context of human quality raters.

The reason Google wants details on the SEO services used for “unnatural link building” is quality control on its algorithms. Like a game of hot and cold, the data tells Google how accurate its algo is at detecting paid links. As Matt said: “Rule #1: Don’t muck with the data you use to evaluate algo quality.”

This also tells us that the spam reports requested with unnatural link warnings are probably used to audit algo quality, not as “mole pointers” for the ongoing whack-a-mole game.
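If Matt’s “Rule #1” sounds abstract, the machine-learning analogy is a held-out test set: the human reports act as ground-truth labels that the algorithm is scored against but never trained on or acted on directly. Here’s a minimal Python sketch of that kind of audit; it’s a hypothetical illustration, not anything Google has published, and every name and number in it is made up:

```python
# A minimal, purely hypothetical sketch of using human spam reports
# as a held-out evaluation set for a paid-link detector.
# Not Google's actual system; all sites and figures are invented.

def evaluate_detector(algo_flags: set, human_reports: set) -> dict:
    """Score algorithmic paid-link flags against human-labeled reports.

    Rule #1 in practice: the human labels are only ever *read* here --
    never fed back into the detector -- or the evaluation stops
    measuring how good the algorithm really is on its own.
    """
    true_pos = len(algo_flags & human_reports)   # flagged and reported
    false_pos = len(algo_flags - human_reports)  # flagged, never reported
    false_neg = len(human_reports - algo_flags)  # reported, but missed

    precision = true_pos / (true_pos + false_pos) if algo_flags else 0.0
    recall = true_pos / (true_pos + false_neg) if human_reports else 0.0
    return {"precision": round(precision, 2), "recall": round(recall, 2)}


# Hypothetical run: the algo flagged three sites, humans reported three.
algo = {"blogfarm-a.example", "blogfarm-b.example", "innocent.example"}
humans = {"blogfarm-a.example", "blogfarm-b.example", "blogfarm-c.example"}

print(evaluate_detector(algo, humans))
# {'precision': 0.67, 'recall': 0.67} -- one false alarm, one site missed
```

The design point is the separation: the moment the reports start driving the penalties themselves, they can no longer tell you how well the algorithm finds paid links on its own.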

Finding this answer is another example of lateral thinking, one of the 7 principles of advanced SEO.


Whack-a-mole pic via Herman Sylvester.
