Rebecca published an interesting post yesterday asking whether we should optimise sites differently for different search engines. Creating pages with different keyword densities for each engine sounds like an idea that might work, but it’s a lot harder in practice.
The technique I’ve used in the past is to change the way that my incoming links work by applying some simple cloaking to the linking sites.
For example, on one of my websites I might have a sitewide link to an affiliate site that only appears when the Yahoo & MSN spiders visit; the rest of the time the link is hidden. That way Google never sees the unnatural sitewide links, so it can’t get upset with me or penalise anyone for them.
MSN & Yahoo, on the other hand, see a nice load of sitewide links and the linked site gets top rankings.
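To make that concrete, here’s a rough sketch of the kind of server-side check involved, written as a tiny Flask app. The affiliate URL is a placeholder and the user-agent tokens I match on (“Slurp” for Yahoo, “msnbot” for MSN) are simply the obvious spider names of the day, so treat this as an illustration rather than a drop-in script.

```python
# Minimal sketch of user-agent based link cloaking (illustrative only).
# Assumes Flask; the affiliate URL and UA tokens are placeholders.
from flask import Flask, request, render_template_string

app = Flask(__name__)

TEMPLATE = """
<p>Regular page content here.</p>
{% if show_affiliate_link %}
  <p><a href="http://affiliate-site.example.com/">Affiliate site</a></p>
{% endif %}
"""

def is_target_spider(user_agent: str) -> bool:
    """Return True only for the spiders that should see the link."""
    ua = (user_agent or "").lower()
    return "slurp" in ua or "msnbot" in ua  # Yahoo and MSN crawlers

@app.route("/")
def page():
    # The sitewide link is only rendered when a targeted spider requests the page;
    # Googlebot and ordinary visitors get the page without it.
    show = is_target_spider(request.headers.get("User-Agent", ""))
    return render_template_string(TEMPLATE, show_affiliate_link=show)
```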
Another interesting technique is to cloak incoming links from sites you don’t own by passing them through a redirect script first. If the user agent is Google, send it to an internal page that’s probably not going to rank (or, if it’s a really spammy link, send it to Wikipedia or just break the link). If the user agent is Yahoo, MSN, or a real person, send them to the real page.
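A redirect script along those lines might look something like the sketch below, again using Flask with placeholder URLs. The per-link “spammy” flag is my own addition, just to show one way of routing the worst links off to Wikipedia instead of the decoy page.

```python
# Minimal sketch of a user-agent aware redirect script (illustrative only).
# Assumes Flask; all destination URLs are placeholders.
from flask import Flask, request, redirect

app = Flask(__name__)

REAL_TARGET = "http://www.example.com/money-page/"       # what Yahoo, MSN and humans should reach
DECOY_TARGET = "http://www.example.com/some-deep-page/"  # internal page unlikely to rank
SPAM_DUMP = "http://en.wikipedia.org/"                    # where really spammy links get sent

@app.route("/out")
def out():
    ua = request.headers.get("User-Agent", "").lower()
    spammy = request.args.get("spammy") == "1"  # flag set per incoming link

    if "googlebot" in ua:
        # Google never reaches the real target, so the link can't count against it
        return redirect(SPAM_DUMP if spammy else DECOY_TARGET, code=302)

    # Yahoo, MSN and real visitors all get the genuine destination
    return redirect(REAL_TARGET, code=302)
```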
Some of you might be thinking that this is just a waste of links and that Google doesn’t hurt sites with lots of spammy or sitewide links. I’ve seen sites rank in MSN & Yahoo within weeks by throwing millions of sitewide links at them – try that with Google and you probably won’t rank for your own name.
This post is by Patrick Altoft from BlogStorm.