
6 Changes We Always Thought Google Would Make to SEO that They Haven’t Yet

From Google’s interpretation of rel=”canonical” to the specificity of anchor text within a link, there are several areas where we thought Google would make a move and are still waiting for it to happen. In today’s Whiteboard Friday, Rand details six of those areas. Let us know where you think things are going in the comments!

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today, I’m going to tackle a subject around some of these changes that a lot of us in the marketing and SEO fields thought Google would be making, but weirdly they haven’t.

This comes up because I talk to a lot of people in the industry. You know, I’ve been on the road the last few weeks at a number of conferences — Boston for SearchLove and SMX Munich, both of which were great events — and I’m going to be heading to a bunch more soon. People have this idea that Google must be doing these things, must have made these advancements over the years. It turns out, in actuality, they haven’t made them. Some of them, there are probably really good reasons behind it, and some of them it might just be because they’re really hard to do.

But let’s talk through a few of these, and in the comments we can get into some discussion about whether, when, or if they might be doing some of these.

So number one, a lot of people in the SEO field, and even outside the field, think that it must be the case that if links really matter for SEO, then on-topic links matter more than off-topic links. So, for example, say there are two websites about gardening resources, A and B, and A gets a link from a botany site while B gets a link from a site about mobile gaming. Well, all other things being equal, it must be that the link from the botany site is the stronger link. That’s just got to be the case.

And yet, we cannot seem to prove this. There doesn’t seem to be any data to support it. Everyone who has analyzed this problem in depth, and a number of very advanced SEOs have gone through the process of classifying links over the years, seems to come to the same conclusion: Google appears to think about links from a fairly subject- and context-agnostic perspective.
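For anyone who wants to try that kind of analysis themselves, here is a minimal sketch of the sort of link classification those studies do, assuming you already have a list of pages that link to the site in question. The keyword list, the threshold, and the URLs are hypothetical placeholders; a real study would use a proper topic classifier and a much larger sample.

```python
# Rough sketch: label linking pages as on-topic or off-topic by keyword counts.
# The keyword list, threshold, and URLs are hypothetical examples only.
import requests
from bs4 import BeautifulSoup

GARDENING_TERMS = {"garden", "gardening", "botany", "plant", "soil", "horticulture"}

def classify_linking_page(url: str) -> str:
    """Label a linking page as on-topic or off-topic by counting topic keywords."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    hits = sum(text.count(term) for term in GARDENING_TERMS)
    return "on-topic" if hits >= 5 else "off-topic"

# Compare rankings of sites whose links come mostly from "on-topic" pages
# against sites whose links come mostly from "off-topic" pages.
for linking_url in ["https://botany-blog.example.com/post",
                    "https://mobile-games.example.com/review"]:
    print(linking_url, classify_linking_page(linking_url))
```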

I think this might be one of those times where they have the technology to do it; they just don’t want to. My guess is that what they’ve found is that if they bias toward these sorts of things, they get a very insular view of what’s popular and important on the Web, and if they take a broader view, they can actually get better results. It turns out that maybe the gardening resources site that botanists love is not the one with mass appeal, not the one that everyone is going to find useful and valuable, and not the one representing the entirety of what the Web thinks should be ranking for gardening resources. So they’ve kind of biased against this.

That is my guess. But from every observable input we’ve been able to run, every test I’ve ever seen from anybody else, it seems to be the case that if there’s any bias, it’s extremely slight, almost unnoticeable. Fascinating.

Number two, I’m actually in this camp. I still think that someday it’s coming, that the influence of anchor text will eventually decline. And yet, while other signals have certainly risen in importance, it seems that specific, keyword-targeted anchor text inside a link is still far more powerful than generic anchor text.

Getting specific: say I link to A with the anchor text “gardening supplies,” and on the same page I write something like, “Oh, this is also a good resource for gardening supplies,” but the only text I actually link to B with is “a good resource.” All other things being equal, A is going to get a lot more ranking power and will rank much higher than B, because that anchor text is still pretty influential. It has a fairly substantive effect.
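If you want to see which kind of anchor text you are actually earning, here is a small sketch that pulls the anchor text of links pointing at a target domain from a given page. The page URL and the target domain below are hypothetical examples.

```python
# Sketch: collect the anchor text of links on a page that point at a target domain,
# so you can see whether you're earning specific anchors ("gardening supplies")
# or generic ones ("a good resource"). The URLs below are hypothetical.
import requests
from bs4 import BeautifulSoup

def anchors_pointing_to(page_url: str, target_domain: str) -> list[str]:
    """Return the anchor text of every link on page_url that points at target_domain."""
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    return [a.get_text(strip=True)
            for a in soup.find_all("a", href=True)
            if target_domain in a["href"]]

print(anchors_pointing_to("https://roundup.example.com/gardening",
                          "site-a.example.com"))
```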

I think this is one of those cases where a lot of SEOs said, “Hey, anchor text is where a lot of manipulation and abuse is happening. It’s where a lot of Web spam happens. Clearly Google’s going to take some action against this.”

My guess, again, is that they’ve seen that the results just aren’t as good without it. This speaks to the power of being able to generate good anchor text. A lot of that, especially when you’re doing content marketing kinds of things for SEO, depends on nomenclature, naming, and branding practices. It’s really about what you call things and what you can get the community and your world to call things. Hummingbird has made advancements in how Google does a lot of this text recognition, but for these tough phrases, anchor text is still strong.

Number three, 302s. So 302s have been one of these sort of long-standing kind of messes of the Web, where a 302 was originally intended as a temporary redirect, but many, many websites and types of servers default to 302s for all kinds of pages that are moving.

So A 301-redirects to B, versus C 302-redirecting to D. Is it really the case that the people who run C plan to change where the redirect points in the future, and is it really the case that they do so more than A does with B?

Well, a lot of the time, probably not. But it still is the case, and you can see plenty of examples of this happening out in the search results and out on the Web, that Google interprets this 301 as being a permanent redirect. All the link juice from A is going to pass right over to B.

With C and D, it appears that, for big brands, when the redirect’s been in place for a long time and Google has some trust in it, and maybe they see some other signals, some other links pointing over there, then yes, some of that value does pass over, but it is not nearly what happens with a 301. A 301 is a directive, while a 302 is sort of a nudge or a hint. So it still seems important to get those 301s, the right kind of redirect, in place.

By the way, there are also a lot of other 30x status codes that can be issued on the Web and that servers might fire. So be careful. If you see a 305, a 307, a 309, something weird, and you’re trying to do a permanent redirect, you probably want a 301. So be cautious of that.
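A quick way to confirm what your server is actually sending back is to request the old URL without following redirects and look at the status code and the Location header. This is just a sketch, with a hypothetical URL.

```python
# Sketch: inspect a redirect without following it, so a 302 (or some other 30x)
# doesn't slip through where a permanent 301 was intended. Hypothetical URL.
import requests

def check_redirect(url: str) -> None:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(no Location header)")
    print(f"{url} -> {response.status_code} {location}")

check_redirect("https://www.example.com/old-page")
# For a permanent redirect you want to see 301 plus the new URL in Location.
```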

Number four, speaking of nudges and hints versus directives, rel=”canonical” has been an interesting one. When rel=”canonical” first launched, what Google said about it was that rel=”canonical” is a hint to us, but we won’t necessarily take it as gospel.

Yet, every test we saw, even from those early launch days, was, man, they are taking it as gospel. You throw a rel=”canonical” on a trusted site accidentally on every page and point it back to the homepage, Google suddenly doesn’t index anything but the homepage. It’s crazy.

You know what? The tests that we’ve seen run and mistakes — oftentimes, sadly, it’s mistakes that are our examples here — that have been made around rel=”canonical” have shown us that Google still has this pretty harsh interpretation that a rel=”canonical” means that the page at A is now at B, and they’re not looking tremendously at whether the content here is super similar. Sometimes they are, especially for manipulative kinds of things. But you’ve got to be careful, when you’re implementing rel=”canonical”, that you’re doing it properly, because you can de-index a lot of pages accidentally.
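A simple way to catch that kind of mistake before it de-indexes a site is to spot-check what the canonical tag on each template actually says. Here is a minimal sketch, with a hypothetical URL.

```python
# Sketch: read the rel="canonical" URL from a page, so you can catch a template
# that accidentally points every page back at the homepage. Hypothetical URL.
import requests
from bs4 import BeautifulSoup

def get_canonical(url: str) -> str | None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag.get("href") if tag else None

print(get_canonical("https://www.example.com/category/page-2"))
# If this prints the homepage URL for deep pages across the site, those pages
# are at risk of being dropped from the index.
```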

So this is an area of caution. It seems like Google still has not progressed on this front, and they’re taking that as a pretty basic directive.

Number five, I think, for a long time, a lot of us have thought, hey, the social web is rising. Social is where a lot of the great content is being shared, where people are pointing to important things, and where endorsements are happening, potentially more so than in the link graph. The social web and the social graph have sort of become the common man’s link graph.

And yet, with the exception of the two years when Google had a very direct partnership with Twitter, when tweets, their indexation, and all that kind of stuff were heavily influential in Google search results, we haven’t seen that again from Google since that partnership broke up. They’ve actually sort of backtracked on social, and they’ve kind of said, “Hey, you know, tweets, Facebook shares, likes, that kind of stuff, it doesn’t directly impact rankings for everyone.”

Google+ is sort of an exception, especially in personalized results. But even the tests we’ve done with Google+ for non-personalized results have appeared to do nothing, as yet.

So these shares that are happening all over social, I think what’s really happening here is that Google is taking a look and saying, “Hey, yes, lots of social sharing is going on.” But the good social sharing, the stuff that sticks around, the stuff that people really feel is important is still, later on at some point, earning a citation, earning a link, a mention, something that they can truly interpret and use in their ranking algorithm.

So they’re relying on the fact that social can be a tip-off or a tipping point for a piece of content or a website or a brand or a product, whatever it is, to achieve some popularity, but that popularity will eventually be reflected in the link graph, and they can wait until that happens rather than using social signals directly. To be fair, there’s some potential for manipulation in those signals that I think they’re worried about exposing themselves to. There’s also, of course, the fact that they no longer have API-level access and partnerships with Facebook and Twitter, and that could be causing some of this too.

Number six, last one. Google talked about cleaning up web spam for a long time, and from ’06, ’07 to about 2011, 2012, it was pretty sketchy. It was tough.

When they did start cleaning up web spam, I think a lot of us thought, “Well, eventually they’re going to get to PPC too.” I don’t mean pay-per-click. I mean porn, pills, and casino.

But it turns out, as Matt Brown from Moz wisely and recently pointed out in his SearchLove presentation in Boston, that, yes, if you look at the search results around these categories, whatever it is — Buy Cialis online, Texas hold ’em no limit poker, [removed for content, because Whiteboard Friday is family-friendly, folks] — whatever the search is that you’re performing in these spheres, this is actually kind of the early warning SERPs of the SEO world.

You can see a lot of the changes that Google’s making around spam, authority, and signal interpretation. One of the most interesting ones, if you study this space, is that a lot of those hacked .edu pages, and the barnacle SEO that was happening on subdomains of more trusted sites that had gotten a bunch of links, that kind of stuff is ending a little bit. We’re seeing a little bit more of the rise, again, of exact match domains and some of the affiliate sites, and of getting links from more creative places, because it does seem like Google’s gotten quite a bit better at which links they consider and at how they judge the authoritativeness of pages that might be hanging on or clinging to a domain but aren’t well linked to internally on some of those more trusted sites.

So, that said, I’m looking forward to some fascinating comments. I’m sure we’re going to have some great discussions around these. We’ll see you again next week for another edition of Whiteboard Friday. Take care.
