
Testing: Moving Our Industry Forward


Posted: 10 Mar 2013 06:57 PM PDT

Posted by Geoff Kenyon

Over the past few years, our industry has changed dramatically. We have seen some of the biggest removals of spam from the search results, and a growing number of people are starting to focus on building quality links rather than just building links. Companies are starting to really invest in content, and sites are building better category pages and are improving their product descriptions. Content marketing is now a "thing." A big thing.

However, while all these changes are great, it seems as if we have stopped testing new ideas before we adopt them. I know there are many exceptions to this generalization, but I see the trend too often. Most SEOs work off of best practices, and that's reasonable: who can argue with good page titles, headlines, and copy, crawlable paths to content, and quality links? But we need to keep testing and refining these areas to get the best results.

A great example of this sort of refinement is ranking factors research. A few years back, SEOmoz did some testing around H1s vs. H2s and found that the H1 didn't provide an added benefit. Whether or not you agree with that finding, it shows how ranking factors can (potentially) change over time.

Over the last few years, Google has rolled out updates that have had a significant impact on search: the canonical tag, hreflang, and rich snippets/support for schema, just to name a few. While there have been tests on these updates, we need to continue to test and keep our knowledge current. Google is continually testing new things, and we need to rely on testing to keep up. For example, back when Search Quality Updates were a thing, Google would "share" the names and descriptions of updates and tests to its search algorithm. Frequently, Google was rolling out or testing 30-40 updates a month.
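
To ground that list, here is a minimal sketch of what canonical and hreflang markup look like in a page's head section. The URLs are placeholders rather than a real implementation, and schema markup for rich snippets is a bigger topic, so it's left out here:

    <head>
      <!-- Canonical tag: tells search engines which URL is the preferred version of this page -->
      <link rel="canonical" href="http://www.example.com/tires/" />

      <!-- hreflang: points search engines at language/region alternates of the same content -->
      <link rel="alternate" hreflang="en-us" href="http://www.example.com/tires/" />
      <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/tires/" />
    </head>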

As you can imagine, that pace adds up to a huge number of potential changes to the algorithm. We need to be testing (new things and old) to make sure we're staying current.

Share your results

In addition to all the updates we are aware of, there is a lot Google isn't telling us, which makes testing and sharing even more important. Barry Schwartz pointed out on Search Engine Roundtable that Google left some important items out of its August/September update list. Further, there are updates that Google will outright deny. If it weren't for people carefully watching and analyzing the SERPs and then sharing their tools (like Dr. Pete's MozCast), we would probably be unaware of much of this activity.

If we don't share our observations after testing, we face two problems: first, we can't confirm and verify what we see (and believe), and second, we can't move our industry forward. While the SEO industry is evolving and SEO is gaining more widespread acceptance, it is still seen by many as a mystery and a dark art. By sharing our tests and results, we educate the industry as a whole and raise not only the bar but also our collective reputation. If we can retire bad practices and low-value tactics, we bring more credibility to the industry.

Share your failures

We all want to conduct awesome, breakthrough tests; it's really exciting to learn new stuff. However, we have a tendency to share only our successes, not our failures. No one really wants to share failure, and it's natural to want to "save face" when your test doesn't go according to plan. But a test that "fails" isn't really a failure.

There is so much we can learn from a test that doesn't go as expected (and sometimes we don't know what will happen). Further, sharing the "failed" results can lead to more ideas. Last week, I posted about 302s passing link equity. I began that test because my first test failed: I was trying to see if a page that was 302'd to another page would retain its rankings. It didn't work; the page I was testing dropped out of the SERPs, but it was replaced by the page on the receiving end of the redirect. That result led me to compare 302s to 301s. On top of that, there was a really good comment from Kane Jamison about further tests to run to gain a better understanding. If I hadn't shared my "failed" results, I would never have learned from my mistakes or gained knowledge where I least expected it.

Below are a few other tests I've run over the years that ended up with "failed" results. I hope you can learn as much from them as I did.

Keyword research with AdWords

For this test, I needed to provide a comparison of head vs. long-tail search volume related to tires. I had heard, at one point, that you could use AdWords impression data for keyword research, so I decided to give it a try. I whipped up a rock-solid domain and set up a broad-match AdWords campaign.

Tires^4!

(People even signed up!)

It didn't work. While we got a lot of impressions, we couldn't access the data we wanted: nearly all of it was lumped into a category called "Other Search Terms," with no breakdown by individual query.

Lesson learned: AdWords impression data isn't great for keyword discovery, at least in the capacity we tried to use it.

Keywords in H2 tags

A few years back, I wanted to see if there was any advantage to placing an H2 tag around keywords in the content. The keywords were styled to look the same as the normal text; the only difference was the H2 tag. I rolled this out on about 10,000 pages and watched the results for a few months.
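
For anyone who wants to rerun something similar, the setup looked conceptually like the sketch below. This is a hypothetical reconstruction, not the actual code from the test; the class name and keyword are made up. The H2 is styled with CSS so it renders exactly like the surrounding body copy:

    <style>
      /* Make the H2 render like ordinary body text */
      h2.kw {
        display: inline;
        font-size: inherit;
        font-weight: normal;
        font-family: inherit;
        margin: 0;
        padding: 0;
      }
    </style>

    <div>
      Looking for <h2 class="kw">discount tires</h2> online?
      We stock all the major brands at every price point.
    </div>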

What did I find? Nothing. Exactly the same as the control group. Still, lesson learned. 

Link title attribute

This failed test is actually one of Paddy Moogan's. He wanted to test the link title attribute to see if it passed any value. He set the title to 'k34343fkadljn3lj' and then checked to see if the site he was linking to improved its ranking for that term.
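
In other words, the test boiled down to a single link along these lines (the URL and anchor text here are placeholders), with the nonsense string appearing in the title attribute and nowhere else:

    <!-- The gibberish string lives only in the title attribute, not in the anchor text or on the target page -->
    <a href="http://www.example.com/" title="k34343fkadljn3lj">some anchor text</a>

If the title attribute passed value the way anchor text does, the target page should have started ranking for that string.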

There was no improvement.

Later, he found out that the site he was linking to (Craig's) was actually down, so it probably wouldn't have been ranking regardless of how it was linked to. This brings up a really important point about testing: double-check everything, even the small details. It can be really frustrating to run a test and then realize it was all for nothing.

Your "failed" tests

We've all been there, so it's time to share your story. What have you recently tested that didn't turn out exactly how you planned? If we can all learn from the mistakes of others, we're in a better place. Drop a line in the comments and let us all know!


