When Google opened for business, PageRank was innovative and produced the best results of any of its peers because the landscape was clean(ish). Folks tended to link naturally. Once the magic behind PageRank became known, that level playing field went out the window, and Google has been working to keep up ever since. That’s why people describe SEO as a Cold War-style arms race.
Skip forward a few years and we learned, through a leaked set of instructions, that Google uses humans to critique web documents and identify spam (though this is not used to determine the actual ranking results). We also know that Google has a zealot’s zeal for doing things algorithmically. This tells me that Google is trying to build its algorithm by engineering formulas that replicate the results produced by its human evaluators, only en masse. (This is a personal opinion.)
It’s similar to the hunt-and-peck methodology that so many MBA research papers are made of. Run a study or conduct a survey, something quantitative. Run the results through a statistical engine. Look for the asterisks that indicate significant relationships. Find published journal articles that can be used to support the statistical results. Finally, write a paper with conclusions based on those statistical relationships.
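To make that workflow concrete, here is a toy sketch of the “look for the asterisks” step in Python. The survey data, factor names, and significance thresholds are all invented for illustration; the point is simply that the statistics engine flags certain relationships and the write-up follows from whatever got flagged.

```python
# A toy illustration of the hunt-and-peck workflow: correlate a survey
# outcome against candidate factors and flag the statistically
# significant ones with asterisks. All data and names are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200

# Hypothetical survey results: one outcome and a handful of candidate factors.
outcome = rng.normal(size=n)
factors = {
    "factor_a": outcome * 0.5 + rng.normal(size=n),  # built to correlate
    "factor_b": rng.normal(size=n),                   # pure noise
    "factor_c": outcome * 0.2 + rng.normal(size=n),   # weak signal
}

for name, values in factors.items():
    r, p = stats.pearsonr(outcome, values)
    # Mimic the "look for the asterisks" step in a statistics printout.
    stars = "***" if p < 0.001 else "**" if p < 0.01 else "*" if p < 0.05 else ""
    print(f"{name}: r={r:+.2f}, p={p:.3f} {stars}")
```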
You can be pretty certain that, among its activities, Google creates sample datasets, analyzes them statistically, looks for the key indicators of quality or lack thereof, then tunes its algorithm based on those indicators. In this analogy, links are the bibliography: they are used to support the findings of the earlier statistical analysis. Links that cannot support the analysis are discarded; those that do are included. Add a weighting or sorting scheme and Google produces its SERPs.
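Here is a minimal sketch, in the same spirit, of what a “weighting or sorting scheme” over quality indicators might look like. The documents, indicator names, and weights are hypothetical and chosen only for illustration (they are certainly not Google’s); the sketch just shows a weighted score being applied and the results sorted into a SERP-like list.

```python
# A minimal sketch: score each document on a few hypothetical quality
# indicators, apply weights (assumed to come from some prior statistical
# analysis), and sort to produce a toy results page.
documents = [
    {"url": "example.com/a", "content_quality": 0.9, "title_relevance": 0.8, "supporting_links": 0.4},
    {"url": "example.com/b", "content_quality": 0.6, "title_relevance": 0.9, "supporting_links": 0.9},
    {"url": "example.com/c", "content_quality": 0.3, "title_relevance": 0.5, "supporting_links": 0.7},
]

# Hypothetical weights; in the analogy these would be tuned from sample data.
weights = {"content_quality": 0.5, "title_relevance": 0.3, "supporting_links": 0.2}

def score(doc):
    """Weighted sum of the document's quality indicators."""
    return sum(weights[k] * doc[k] for k in weights)

# Sort highest score first to produce the toy SERP.
for rank, doc in enumerate(sorted(documents, key=score, reverse=True), start=1):
    print(rank, doc["url"], round(score(doc), 2))
```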
Because the Web is not static but ever changing, this is a bit like a dog chasing its tail. As quickly as Google implements a new set of standards, it must re-sample and repeat the process.
When you look at Google in this manner, a few things become clear:
- In a perfect world Google would only care about on-document and on-site ranking factors; this is what truly drives its rankings. It also explains why elements like TITLE can have such a big impact. Yes, links are important and can definitely outweigh other factors, especially in highly competitive sectors, but the real heart of Google’s analysis is focused on your content, and in the future doubly so.
- Website administrators and marketers need to be forward-looking and forward-thinking. Take reciprocal links as an example. Anyone who put thought into the evolution of search could figure out that mass or random reciprocal links would lose value and that contextual links would gain value. The smart ones set aside time to build a base of high-quality links and were not subject to violent ranking losses. Today viral marketing is hot, but smart people realize that a burst of links to a viral target does not honestly indicate that their home page or other important landing pages are best in class. The smart ones are thinking ahead.
- One should always work to improve non-archival content. Just as Google is always looking to improve its analysis, webmasters and content creators need to engage in constant learning and to inject that knowledge into their content. This applies to the content itself, the document markup, the visual presentation, and the site architecture.
- While it would be nice to build and optimize documents and domains for a perfect world, to make progress you must build and optimize for the world that exists right now. That means understanding your target market, where they go, how they look for information, and what queries they put into search boxes. Then analyze the heck out of the things that matter to understand why the top websites are tops. Finally, emulate, outpace, and out-innovate those top websites.