Monday, 21 March 2011

SEOmoz Daily SEO Blog



Defining Your True Competitors

Posted: 20 Mar 2011 02:05 PM PDT

Posted by Benjamin Estes

One of the first things to figure out for an SEO campaign is who your site is competing against in the SERPs.  If you are consulting for a client, just asking what they know about their competitors might not be enough.  Many clients will have a good handle on who their business competitors are, but these may differ substantially from the sites ranking for competitive terms.  If you are running your own site, or are doing in-house SEO, you’ve probably spent a good deal of time getting comfortable with how strong the site is and for which terms it is ranking well.  Either way, a more systematic approach to this preliminary competitive research can do wonders for prioritizing your SEO efforts.

Why Competitive Research?

It is true that competitive research can provide a survey of competitive link building tactics, many of which may be replicable for your own site.  But more importantly, competitive research can show you which of your potential strategies is most likely to provide your site unique value, value that your competitors will probably have a harder time getting or which they seem to have neglected so far.  For more on the nitty-gritty of competitive research, check Justin Briggs' guide to competitive backlink analysis.

A First Look At The SERPs

We all know what we're here for: rankings! Traffic! And rank is directly related to click-through rate. We've all seen the pretty charts from places like Eyetools:

These eye-tracking studies are great for measuring usability, and they make really pretty pictures.  But for what we're trying to accomplish here, some more concrete numbers will be useful.

In Kate Morris' blog post about predicting site traffic, she cited a source of click-through rates from Chitika that I still use.  There are probably more recent studies now; if you have numbers you trust more, feel free to use them instead.  The click-through percentage drops off very steeply as rank increases—note that the first position has about twice the click-through rate of the second:

Click Through by Rank 

It's easy enough to see which sites are in the SERP and which rank higher or lower than your own site—just load up the search page!  But often a site will have multiple pages listed in the SERP.  So, on a per-search-term basis, I like to add these together per-site.  Take the following SERP, which I put in a spreadsheet to make it easier to work with (check out Tom Critchlow's post if you'd like to speed up your own result scraping).  The search is "the clash guitar tabs":

The Clash Guitar Tabs Google Search

So I would say to myself at this point that www.ultimate-guitar.com is receiving 51.31% of the traffic for this search query (or 62.73% if I wanted to include tabs.ultimate-guitar.com in the figure).  On the other hand, www.guitaretab.com, though occupying three places in the SERP as well, is receiving 18.75% of the traffic.  Simple enough?
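To make the arithmetic concrete, here's a minimal Python sketch of this step. The CTR-by-rank figures are illustrative placeholders shaped like the Chitika curve (position one roughly double position two), not the study's actual numbers, and the URLs are hypothetical stand-ins rather than the real SERP above:

```python
# Estimate each domain's share of clicks for one SERP by summing the
# CTR of every position the domain occupies.
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical CTR-by-rank figures (steep drop-off, rank 1 ~2x rank 2).
CTR_BY_RANK = {1: 0.34, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06,
               6: 0.05, 7: 0.04, 8: 0.035, 9: 0.03, 10: 0.025}

def domain_click_share(serp_urls):
    """serp_urls: ranked list of result URLs for one query."""
    share = defaultdict(float)
    for rank, url in enumerate(serp_urls, start=1):
        host = urlparse(url).netloc
        share[host] += CTR_BY_RANK.get(rank, 0.0)
    return dict(share)

# Hypothetical results for "the clash guitar tabs":
serp = [
    "http://www.ultimate-guitar.com/tabs/c/clash/",
    "http://www.guitaretab.com/c/clash/",
    "http://www.ultimate-guitar.com/tabs/c/clash/2.htm",
    "http://tabs.ultimate-guitar.com/c/clash/",
]
print(domain_click_share(serp))
```

With these placeholder figures, www.ultimate-guitar.com (positions 1 and 3) gets 0.34 + 0.11 = 0.45 of the estimated clicks, analogous to the 51.31% figure above.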

Taking It To The Next Level

This is all very straightforward so far: intuitive, simple, and not exceptionally useful on its own.  But...

...what if, instead of restricting myself to a single SERP, I were to aggregate data from multiple searches and sum the click-through rate for each domain across these searches?  Searches for "the clash guitar tabs" and "pink floyd guitar tabs" are listed below, one atop the other.  I've highlighted www.ultimate-guitar.com and www.guitaretab.com for reference:

Using the magic of pivot tables I can then sum these values per-domain (if you need a pivot table refresher, check out Mike's Excel for SEO guide):
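If you'd rather script the pivot than build it in Excel, the per-domain sum across queries looks something like this in plain Python. The SERP data and CTR figures here are made up for illustration:

```python
# Sum estimated click share per domain across several queries, then rank.
from collections import defaultdict

# Hypothetical CTR by position (index 0 = rank 1).
CTR_BY_RANK = [0.34, 0.17, 0.11, 0.08, 0.06, 0.05, 0.04, 0.035, 0.03, 0.025]

# Each query maps to its ranked list of result domains (made-up data).
serps = {
    "the clash guitar tabs":  ["www.ultimate-guitar.com", "www.guitaretab.com",
                               "www.ultimate-guitar.com"],
    "pink floyd guitar tabs": ["www.guitaretab.com", "www.ultimate-guitar.com",
                               "www.azchords.com"],
}

totals = defaultdict(float)
for query, domains in serps.items():
    for idx, domain in enumerate(domains):
        totals[domain] += CTR_BY_RANK[idx]

# Domains sorted by aggregate strength, strongest first.
ranking = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
```

The sort surfaces the most powerful domains immediately, which is exactly what the pivot table does when you sort by the summed value column.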

The most powerful domains rise to the top of this list quickly.  This is, of course, a very small data set.  It is also a market that a few sites have dominated.  If you want an interesting data set to practice this method with, try a market with many different brands ("vacuum tubes" works well—"svetlana vacuum tubes", "groove vacuum tubes", "ehx vacuum tubes" and so forth).

Get Creative

Once you've collected ranking data, you can organize it in any number of creative ways to navigate the data more intuitively—and hopefully make the data more actionable.  Here is one of my favorite pivot tables, which shows how much strength each domain is receiving from results in each position (rank 1-10) in the SERPs:

This makes it easy to see which sites aren't meeting a certain threshold (e.g. never rank above position five), even though they show up in the SERPs frequently.  You can also limit the list of sites in question to those with at least one page in the first page of search results.
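Here's a rough Python sketch of that domain-by-position cross-tab, again with made-up data:

```python
# Build matrix[domain][rank] = click share the domain earns from that
# position, aggregated across queries.
from collections import defaultdict

CTR_BY_RANK = [0.34, 0.17, 0.11, 0.08, 0.06, 0.05, 0.04, 0.035, 0.03, 0.025]

serps = [  # one ranked domain list per query (hypothetical)
    ["www.ultimate-guitar.com", "www.guitaretab.com", "www.azchords.com"],
    ["www.guitaretab.com", "www.azchords.com", "www.ultimate-guitar.com"],
]

matrix = defaultdict(lambda: defaultdict(float))
for domains in serps:
    for idx, domain in enumerate(domains):
        matrix[domain][idx + 1] += CTR_BY_RANK[idx]

# A domain that never cracks the top positions shows zeros in the low
# rank columns; its best rank makes the threshold check easy.
best_rank = {d: min(ranks) for d, ranks in matrix.items()}
print(best_rank)
```

Filtering `best_rank` for domains whose best position is worse than five reproduces the "frequently present but never near the top" check described above.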

Where Do I Go From Here?

There are many ways to tweak this process.  You could use only results for high-traffic terms, or only for long-tail terms.  You could throw in a representative sample of both.  I also like to get the standard deviation from the average for each domain and set a threshold (e.g. any site greater than 2 SDs above the average is a competitor worth looking into).
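The standard-deviation threshold is a one-liner once you have the per-domain totals. A sketch, with hypothetical scores:

```python
# Flag domains whose aggregate click share sits more than 2 standard
# deviations above the mean. All scores below are made up.
from statistics import mean, stdev

totals = {
    "www.ultimate-guitar.com": 5.0, "www.guitaretab.com": 0.6,
    "www.azchords.com": 0.5, "www.e-chords.com": 0.4,
    "www.songsterr.com": 0.3, "www.chordie.com": 0.3,
    "www.911tabs.com": 0.2, "www.tabcrawler.com": 0.2,
}

mu = mean(totals.values())
sd = stdev(totals.values())
competitors = [d for d, v in totals.items() if v > mu + 2 * sd]
print(competitors)
```

Note that with a small sample a single outlier inflates the standard deviation, so a 2-SD cutoff is quite strict; tune the multiplier to taste.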

I'm sure this method will be criticized for disregarding actual search traffic numbers in my assessment.  I know that a more rigorous solution would seek out such figures, or other ways of quantifying how important individual search terms are.  In fact, traffic aside, click-through rates can vary widely based on other factors: stacked results, blended results with maps, videos, and images, and so forth.  There is a trade-off to be made, though, between the time this process takes and the power of the data it provides.

When I'm looking for a quick overview of the competitive landscape, I want my method to work fast so I can start digging into competitors' practices and backlink profiles.  Does it really matter exactly where my competitors stand against each other?  Isn't it enough to be able to find the top five or ten competitors quickly?  I recommend finding the balance of detail and time that you are comfortable with.  A solid ROI, if you will.



Link Building London: Absolutely Remarkable

Posted: 20 Mar 2011 12:14 PM PDT

Posted by randfish

I have attended an extremely large number of SEO-focused events in the past 7 years, planned and run seminars and training myself, and contributed keynotes and sessions to major conferences. Yet, in my opinion, Distilled's Link Building day in London has just grabbed the title of "best single day of content" ever at an SEO event.

In the past, I've been impressed in particular by the more expert-level shows like SMX Advanced, SES London and SEOmoz's own PRO Training (now officially MozCon), but this one took the cake. I'm, apparently, not the only one who thinks so:

 

I obviously can't give away all of the phenomenal tips and content from the day, but I can share a few of my notes:

  • Martin MacDonald of SEOForums showed the remarkable power of updatable, embedded widgets and how these can be remotely controlled to shift link locations, anchor text, etc. on the fly. He ran a specific experiment to show it off that had the crowd gasping and Tom Critchlow tweeting.
  • Wil Reynolds, founder of SeerInteractive, explained a tactic his team has put to remarkable use over the years. First, check pages 10+ of the SERPs for your target keyword and you'll often find sites that have been abandoned or largely neglected. Use the top pages functionality from Open Site Explorer or Majestic to find the resources those sites built that earned links, then remake modern, updated versions on your site (or for your client). Now, simply contact the sites/pages that linked to the old version and presto - a huge opportunity for the revitalization of great content and a direct path to links.
  • Jane Copland, link specialist and SEO extraordinaire at Ayima, talked about the problems with aggressive anchor text links, even if they come from good sources. She explained that risk can be mitigated by diversifying with branded anchor text and "click here" style links. Her recommendation, strange though it might seem, is to use the trusted, relationship links for non-optimized anchor text to help build a greater profile of trust. Even good links from partners and completely white-hat endorsements can look suspicious if they constantly use your top keywords as anchors.
  • Russ Jones of Virante looked deeply into how to leverage classic "linkbait" now that Digg has (mostly) died. His findings with regard to Reddit are remarkable - even a few votes on a medium-popularity subreddit have sent him 35K+ visits (on par with what Digg used to send). What's more, though links are hard to come by, Facebook shares/likes and Tweets aren't, and these can help with rankings, too.

And there's good coverage from a few bloggers, too:

Tickets are still available for New Orleans this coming Friday, but sales close Tuesday, March 22nd around noon Eastern (less than 40 hours away).

New Orleans Link Building

I can promise you'll get more link building goodness in that one day than anywhere else, and at a great price. I just checked Kayak, and many US cities have roundtrip rates under $400 for a ticket.

p.s. Just to be wholly transparent, while SEOmoz does not benefit financially from these events, we do have a long-term partnership with Distilled. I really do feel that this was the best day of content at a conference I've attended, including ones I've organized. Suppose I'll just have to step up my game :-)


