In our industry we ask a lot of questions. How do I make better content? What can I do to build more links? Which seat can I take? But perhaps the most frequently asked question in the SEO community concerns the nature of search engines and their algorithms.
SEOMoz does an incredible job of enlightening us through their research on search engine ranking factors. This topic is obviously of utmost interest to all of us, as posts about social signals in search and ranking correlations get a whole lot more thumbs up and comments than most others.
But the question I want to ask is not what the ranking factors are now, but what the ranking factors of the future will be. Search engines and their algorithms are evolving, and if our websites, content, links and strategies don’t evolve with them, well, then you still have invisible text on your pages and 10,000 keywords in your meta keywords tag. How’s that working out for you right now?
There are already articles about the future of search engine algorithms, but I would like to offer up my own speculation, based on experience, existing factors and common sense. I believe the future of search can be categorized by what I will call The Six V’s: value, validity, variety, vision, volume and visitors. That’s a mouthful. Let me explain what I mean by each one.
Value
The value of a website’s content is hard to determine, but this factor will be driven primarily by links. Right now we say links determine your trust, authority and relevance. All of those things say the same thing to me, and that is value.
Value can also come in the form of a brand. ESPN is a very valuable brand. Content that belongs to ESPN is therefore going to be valuable. This is like domain authority is now. The value of a page is determined by the links to that page; the value of a domain is determined by the links to all of the domain’s pages.
However, the way links pass value must change. For one, anchor text needs to lose much of its weight as a ranking factor, because it encourages spammy links all over the place. And paid links will eventually be penalized more heavily.
I don’t see link metrics changing a whole lot from where they are now, but I thought it would be easier to condense trust, authority and relevance into one metric simply called value.
Validity
There is a ton of crap online. A lot of what is being said is simply not valid. I think search engines are going to be able to analyze language to the point that they can use semantics to determine the validity of certain content.
For example, “lose 100 pounds in just two weeks” is not a valid claim. “Make ten million dollars online in one month” is not a valid claim to someone who does not currently have a website. “Eating junk food all day is healthy and makes your heart stronger” is not a valid claim.
One sign of valid content is that it is duplicated. If your content is quoted elsewhere, that is one way to validate it. If your content uses proper spelling and grammar, it is more likely to be valid. Content that is poorly written, repeats keywords, or bolds the same phrase over and over is probably not valid.
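To make that concrete, here is a toy sketch in Python of a crude validity check based on two of the signals above: keyword stuffing and repeated bold phrases. The thresholds are numbers I made up purely for illustration; no search engine has published how, or whether, it scores signals like these.

```python
import re

def validity_score(text: str) -> float:
    """Toy heuristic: score text on a 0-1 scale using crude
    validity signals. All thresholds are arbitrary illustrations."""
    words = text.lower().split()
    if not words:
        return 0.0
    score = 1.0

    # Keyword stuffing: penalize when one word dominates the page.
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    top_freq = max(counts.values()) / len(words)
    if top_freq > 0.15:  # a single word is more than 15% of all words
        score -= 0.5

    # Repeated bold phrase: penalize identical <b>...</b> runs.
    bolds = re.findall(r"<b>(.*?)</b>", text, re.IGNORECASE)
    if bolds and len(bolds) != len(set(bolds)):
        score -= 0.3

    return max(score, 0.0)
```

A real engine would obviously use far richer language analysis than word counts, but the comparative idea is the same: pages that look machine-stuffed score lower than pages that read like normal prose.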
I don’t think we are far off from search engines being able to validate content. I think they do to an extent now in determining the quality of content. The Panda Update is a sure sign to us that Google is moving in this direction.
Another thing search engines likely already look at and will examine further in terms of validity is the existence of a privacy policy, terms and conditions, SSL certificate for e-commerce and the presence or lack of affiliate links. Sites with no privacy policy or disclosures and lots of affiliate links are typically not of the highest quality.
Variety
Is your content pretty much the same thing as your competitors? If you don’t offer anything different, why should search engines want to display your content? I think search engines will catch on to this idea very soon.
For example, I did a search in Google for “payday loans” and nine out of 10 results were websites offering me a payday loan. The other was Wikipedia. Why would I want nine identical websites with identical content and identical services in my top ten? What if I simply wanted to learn about payday loans and not necessarily get one?
Related to this is the variety within the search results themselves. I had Google Places and Google News in my search results as well. Many searches return videos or images too. I think search engines are trending toward offering a variety of result types for a search query.
The best way to show variety to Google is to be unique and different from your competition. Use multimedia in your content. Offer a service that isn’t already out there. Do something that sets you apart from the millions of other websites out there.
Vision
By vision I am referring to the actual design of your website. It’s long past time for search engines to start recognizing websites whose designs are so bad that they get in the way of the content. I hate finding these websites. You all know the ones I am talking about. In case you don’t, here is a good example: Yale University School of Art.
Website designs like this one are usually the hallmark of a scam. Endless text. A million calls to action. Every sentence in a different font, color and size. You know the drill; we have all seen them. I am confident that at some point in the future search engines will be able to identify sites with designs that aren’t merely ugly but are actually detrimental to the usability of the site.
Search engines do not want to display sites in their results that are impossible to navigate and use. I expect that soon they will do something about it and penalize sites with designs so bad that people have trouble using them.
Volume
Let’s be honest, a website with valuable, valid content that adds 100 new pages a day is simply going to be perceived by users as a higher quality website. ESPN blows my college football blog out of the water.
I realize that not all websites are blogs. Many websites are static and do not add content. They may only consist of 10 pages and that is all. That’s perfectly fine. This metric, as with all others, is a comparative metric. That means the volume of your website compared to your competitors, not compared to all sites in general.
So if someone searches “Watches” in Google, I would think the site with 10,000 pages of watches would do better than the site with 20 pages of watches. This plays into the variety metric a bit as well.
I believe the key to this is simply to have as robust a website as possible, or at least more robust than the websites currently in the top ten. I also believe that having a blog as part of your website will be a critical element to increasing the volume of your website.
Visitors
Many have guessed, a few have hinted and contradictory remarks have been made when it comes to visitors being used as a ranking signal. If it is not already happening, it will soon. The massive amounts of data search engines have simply cannot be ignored.
This looks at things like how long visitors are on your site, how many pages they visit, bounce rate of pages viewed from search results, etc. I believe how users interact with your website will generally be used as a ranking signal.
Now this is an easy metric to spam, that’s for sure. However, I believe search engines will have safeguards in place to protect against that. They can throw out outliers and use other methods to detect someone unnaturally interacting with your website. They could look at repeat visitors and see something like “hey, this user visited that website every day for 30 straight days and never stayed for more than 2 seconds, must be a spammer.”
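As a toy illustration of that repeat-visitor check, here is a Python sketch. The 30-visit and 2-second thresholds are invented to match the example above; this is my guess at the kind of rule an engine might apply, not anything a search engine has documented.

```python
def looks_like_spam(dwell_times, min_visits=30, max_dwell_seconds=2.0):
    """Flag a repeat visitor whose every visit is suspiciously short.

    dwell_times: seconds spent on the site, one entry per visit.
    The 30-visit / 2-second thresholds are arbitrary illustrations.
    """
    if len(dwell_times) < min_visits:
        return False  # too little history to judge either way
    # Only flag when *every* visit was a near-instant bounce.
    return all(t <= max_dwell_seconds for t in dwell_times)
```

A visitor who hit the site 30 days running and never stayed longer than two seconds would be flagged; a normal mix of short and long visits would not, which is exactly the outlier-filtering idea described above.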
Search engines know a lot about your visitors, and I’m sure this information will be used in your rankings.
But what about social signals?
Good question, I’m glad I asked. Social signals will play into the value and validity of your content. Lots of social links will go a long way in showing the value of your content. And lots of Facebook likes will go a long way in validating your content.
So with all of this in mind, here are my predictions for the immediate future of search:
- The link metrics of anchor text and placement on the page will receive much less weight in the algorithm.
- Proper spelling and grammar will become big ranking factors.
- Search engines will start displaying more variety, diversity and unique content in their results.
- Usability, functionality and visual organization of a website will become a ranking factor.
- More fresh, robust information will be rewarded by search engines.
- User data will be incorporated into the algorithm.
I realize that some of these are or may be used by search engines now. In those cases, they will become bigger ranking factors than they already are. I believe that as time goes on we will see less influence from the traditional link metrics we’ve all been using for the last five years and more influence from the factors named above.
What factors do you think will have more or less influence in the future?