Over the past few years, our industry has changed dramatically. We have seen some of the biggest removals of spam from the search results, and a growing number of people are starting to focus on building quality links rather than just building links. Companies are starting to invest seriously in content, building better category pages and improving their product descriptions. Content marketing is now a “thing.” A big thing.
However, while all these changes are great, it seems as if we have stopped testing ideas before adopting them. While I know there are many exceptions to this generalization, I see the trend too often. Most SEOs work from best practices, and that’s reasonable: who can argue with good page titles, headlines, and copy, crawlable paths to content, and quality links? But we need to keep testing and refining these fundamentals to get the best results.
A great example of this sort of refinement is ranking-factors research. A few years back, SEOmoz ran some tests on H1s vs. H2s and reported that the H1 doesn’t provide an added benefit. Whether or not you agree with that finding, it shows how factors can (potentially) change over time.
Over the last few years, Google has rolled out updates that have had a significant impact on search: the canonical tag, hreflang, and rich snippets/schema support, just to name a few (two of these are shown below). While there have been tests on these updates, we need to continue to test and keep our knowledge current. Google is continually trying new things, and we need to rely on testing to keep up. For example, back when Search Quality Updates were a thing, Google would “share” the names and descriptions of updates and tests made to the search algorithm. Frequently, there were 30-40 updates a month being rolled out or tested.
At that pace, the potential for change to the algorithm is enormous. We need to be testing (new things and old) to make sure we’re staying current.
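For reference, two of the features mentioned above live as simple tags in a page’s head. A minimal sketch, with placeholder URLs:

```html
<!-- Canonical tag: point duplicate URLs at the preferred version -->
<link rel="canonical" href="https://example.com/tires/" />

<!-- hreflang: declare the language/region variants of a page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/tires/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/tires/" />
```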
Share your results
In addition to all the updates we are aware of, there is a lot that Google isn’t telling us. This is what makes testing and sharing even more important. Barry Schwartz pointed out on Search Engine Roundtable that Google left some important items out of their August/September update list. Further, there are updates that Google will deny. If it weren’t for people carefully watching and analyzing the SERPs and then sharing their tools (like Dr. Pete’s MozCast), we would be largely unaware of much of this activity.
If we don’t share our observations after testing, we face two problems: first, we can’t confirm and verify what we see (and believe), and second, we can’t move our industry forward. While the SEO industry is evolving and SEO is gaining more widespread acceptance, it is still seen by many as a mystery, even a dark art. By sharing our tests and results, we educate the industry as a whole and raise not only the bar but also our collective reputation. If we can retire bad practices and low-value tactics, we bring more credibility to the industry.
Share your failures
We all want to run awesome, breakthrough tests; it’s really exciting to learn new stuff. However, we have a tendency to share only our successes, not our failures. No one really wants to broadcast failure, and it’s natural to want to “save face” when your test doesn’t go according to plan. But the fact remains that a test that “fails” isn’t a failure.
There is so much we can learn from a test that doesn’t go as expected (and sometimes we don’t know what will happen). Further, sharing the “failed” results can lead to more ideas. Last week, I posted about 302s passing link equity. I began that test because my first test failed: I was trying to see if a page that was 302’d to another page would retain its rankings. It didn’t; the page I was testing dropped out of the SERPs, but it was replaced by the page on the receiving end of the redirect. That result led me to test 302s against 301s. On top of that, there was a really good comment from Kane Jamison about further tests to run to gain a better understanding. If I hadn’t shared my “failed” results, I would never have learned from my mistakes and gained knowledge where I least expected it.
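For anyone fuzzy on the mechanics: a 302 is a temporary redirect, and the raw HTTP response from the redirected URL looks something like this (the URL is a placeholder):

```
HTTP/1.1 302 Found
Location: http://example.com/receiving-page
```

A 301 returns “Moved Permanently” instead, which is exactly why the two are worth testing side by side.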
Below are a few other tests I’ve run over the years that ended up with “failed” results. I hope you can learn as much from them as I did.
Keyword research with AdWords
For this test, I needed to compare head vs. long-tail search volume for terms related to tires. I had heard, at one point, that you could use AdWords impression data for keyword research, so I decided to give it a try. I whipped up a rock-solid domain and set up a broad match AdWords campaign.
(People even signed up!)
It didn’t work. While we got a lot of impressions, we couldn’t access the data: there was a category called “other search terms” that contained all the impression data we wanted, with no way to see the individual queries behind it.
Lesson learned: AdWords impression data isn’t great for keyword discovery, at least not in the way we tried to use it.
Keywords in H2 tags
A few years back, I wanted to see if there was any advantage to placing an H2 tag around keywords in the content. The keywords were styled to look the same as the normal text; the only difference was the H2 tag. I rolled this out on about 10,000 pages and watched the results for a few months.
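As a minimal sketch, assuming styling along these lines (the class name and copy are mine, not the markup from the actual test), the idea looks like this:

```html
<style>
  /* Make the H2 render identically to body text */
  h2.inline-keyword {
    display: inline;
    font-size: 1em;
    font-weight: normal;
    margin: 0;
  }
</style>

<p>Shop our full selection of all-season tires.</p>
<h2 class="inline-keyword">cheap winter tires</h2>
<p>Every order of four or more ships free.</p>
```

To a visitor, the heading reads as ordinary copy; the only difference a crawler sees is the tag itself.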
What did I find? Nothing. Exactly the same as the control group. Still, lesson learned.
Link title element
This failed test is actually one of Paddy Moogan’s. He wanted to test the link title element to see if it passed any value. He set the title of a link pointing to Craig’s site to “k34343fkadljn3lj” and then checked to see if the site improved its ranking for that term.
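In markup terms, the test link would have looked something like this (the URL and anchor text are placeholders; the title value is the actual string from the test):

```html
<a href="http://example.com/" title="k34343fkadljn3lj">anchor text</a>
```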
There was no improvement.
Later, he found out that Craig’s site was actually down, so it probably wouldn’t have ranked regardless of how it was linked to. This brings up a really important point about testing: double-check everything, even the small things. It can be really frustrating to run a test and then realize it was all for nothing.
Your “failed” tests
We’ve all been there, so it’s time to share your story. What have you recently tested that didn’t turn out exactly how you planned? If we can all learn from the mistakes of others, we’re in a better place. Drop a line in the comments and let us all know!