Posted by Benjamin Estes
One of the first things to figure out for an SEO campaign is who your site is competing against in the SERPs. If you are consulting for a client, just asking what they know about their competitors might not be enough. Many clients will have a good handle on who their business competitors are, but these may differ substantially from the sites ranking for competitive terms. If you are running your own site, or are doing in-house SEO, you’ve probably spent a good deal of time getting comfortable with how strong the site is and for which terms it is ranking well. Either way, a more systematic approach to this preliminary competitive research can do wonders for prioritizing your SEO efforts.
Why Competitive Research?
It is true that competitive research can provide a survey of competitive link building tactics, many of which may be replicable for your own site. More importantly, though, competitive research can show you which of your potential strategies is most likely to provide your site unique value: value that your competitors will probably have a harder time acquiring, or that they seem to have neglected so far. For more on the nitty-gritty of competitive research, check out Justin Briggs' guide to competitive backlink analysis.
A First Look At The SERPs
We all know what we’re here for: rankings! traffic! And rank is directly related to click-through rate. We've all seen the pretty charts from places like Eyetools:
These eye-tracking studies are great for measuring usability, and they make really pretty pictures. But for what we're trying to accomplish here, some more concrete numbers will be useful.
In Kate Morris' blog post about predicting site traffic, she cited a set of click-through rates from Chitika that I still use. There are probably more recent studies by now; if you have numbers you trust more, feel free to use them instead. The click-through percentage drops off very steeply as rank increases—note that the first position has about twice the click-through rate of the second:
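For convenience, here's that table in code form, since the sketches later in this post build on it. These are the Chitika figures as commonly cited; if you're using a different study, just swap in your own numbers:

```python
# Chitika (2010) click-through rate by SERP position, in percent.
# Substitute any figures you trust more; the method doesn't change.
CTR_BY_RANK = {
    1: 34.35, 2: 16.96, 3: 11.42, 4: 7.73, 5: 6.19,
    6: 5.05, 7: 4.02, 8: 3.47, 9: 2.85, 10: 2.71,
}
```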
It's easy enough to see which sites are in the SERP and which rank higher or lower than your own site—just load up the search page! But often a site will have multiple pages listed in the SERP. So, for each search term, I like to add these click-through rates together per site. Take the following SERP, which I put in a spreadsheet to make it easier to work with (check out Tom Critchlow's post if you'd like to speed up your own result scraping). The search is "the clash guitar tabs":
So I would say to myself at this point that www.ultimate-guitar.com is receiving 51.31% of the traffic for this search query (or 62.73% if I wanted to include tabs.ultimate-guitar.com in the figure). On the other hand, www.guitaretab.com, though also occupying three places in the SERP, is receiving 18.75% of the traffic. Simple enough?
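If you'd rather script this than eyeball it, here's a minimal Python sketch of the same arithmetic. The first three positions follow from the percentages quoted above; the rest of the ordering is made up purely for illustration:

```python
from collections import defaultdict

# Chitika (2010) CTR estimates for positions 1-10, in percent.
CTR = [34.35, 16.96, 11.42, 7.73, 6.19, 5.05, 4.02, 3.47, 2.85, 2.71]

def ctr_share(serp):
    """Sum the estimated click-through share per domain for one SERP.

    `serp` lists the ranking domains in order, position 1 first.
    """
    share = defaultdict(float)
    for i, domain in enumerate(serp):  # i is the zero-based position
        share[domain] += CTR[i]
    return dict(share)

# Positions 1-3 are implied by the figures above; 4-10 are hypothetical.
serp = [
    "www.ultimate-guitar.com",
    "www.ultimate-guitar.com",
    "tabs.ultimate-guitar.com",
    "www.guitaretab.com",
    "www.guitaretab.com",
    "www.azchords.com",
    "www.guitaretab.com",
    "www.chordie.com",
    "www.911tabs.com",
    "www.e-chords.com",
]
print(ctr_share(serp))  # www.ultimate-guitar.com sums to the 51.31 quoted above
```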
Taking It To The Next Level
This is all very straightforward so far: intuitive, simple, and not exceptionally useful on its own. But...
...what if, instead of restricting myself to a single SERP, I were to aggregate data from multiple searches and sum the click-through rate for each domain across those searches? Searches for "the clash guitar tabs" and "pink floyd guitar tabs" are listed below, one atop the other. I've highlighted www.ultimate-guitar.com and www.guitaretab.com for reference:
Using the magic of pivot tables I can then sum these values per domain (if you need a pivot table refresher, check out Mike's Excel for SEO guide):
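For those who'd rather skip Excel entirely, the same aggregation is a few lines of pandas. The rows below are placeholders standing in for your scraped rankings:

```python
import pandas as pd

# One row per (query, rank, domain) with the estimated CTR for that rank;
# these rows are placeholders standing in for real scraped results.
rows = [
    ("the clash guitar tabs", 1, "www.ultimate-guitar.com", 34.35),
    ("the clash guitar tabs", 2, "www.ultimate-guitar.com", 16.96),
    ("the clash guitar tabs", 4, "www.guitaretab.com", 7.73),
    ("pink floyd guitar tabs", 1, "www.guitaretab.com", 34.35),
    ("pink floyd guitar tabs", 3, "www.ultimate-guitar.com", 11.42),
]
df = pd.DataFrame(rows, columns=["query", "rank", "domain", "ctr"])

# The pivot: total estimated CTR per domain, summed across every query.
strength = df.groupby("domain")["ctr"].sum().sort_values(ascending=False)
print(strength)
```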
The most powerful domains rise to the top of this list quickly. This is, of course, a very small data set. It is also a market that a few sites have dominated. If you want an interesting data set to practice this method with, try a market with many different brands ("vacuum tubes" works well—"svetlana vacuum tubes", "groove vacuum tubes", "ehx vacuum tubes" and so forth).
Get Creative
Once you've collected ranking data, you can organize it in any number of creative ways to navigate the data more intuitively—and hopefully make the data more actionable. Here is one of my favorite pivot tables, which shows how much strength each domain is receiving from results in each position (rank 1-10) in the SERPs:
This makes it easy to see which sites aren't meeting a certain threshold (e.g. never rank above position five), even though they show up in the SERPs frequently. You can also limit the list of sites in question to those with at least one page in the first page of search results.
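Here's a quick sketch of both ideas in pandas: the per-position pivot, and the filter for sites that never crack the top five (again with placeholder data):

```python
import pandas as pd

# Placeholder rankings: one row per (query, rank, domain), with the
# estimated CTR for that rank.
df = pd.DataFrame(
    [
        ("the clash guitar tabs", 1, "www.ultimate-guitar.com", 34.35),
        ("the clash guitar tabs", 4, "www.guitaretab.com", 7.73),
        ("pink floyd guitar tabs", 2, "www.guitaretab.com", 16.96),
        ("pink floyd guitar tabs", 6, "www.azchords.com", 5.05),
        ("pink floyd guitar tabs", 8, "www.azchords.com", 3.47),
    ],
    columns=["query", "rank", "domain", "ctr"],
)

# Strength each domain draws from each SERP position.
by_position = df.pivot_table(
    index="domain", columns="rank", values="ctr", aggfunc="sum", fill_value=0
)
print(by_position)

# Frequent also-rans: domains whose best result never beats position five.
best_rank = df.groupby("domain")["rank"].min()
print(best_rank[best_rank >= 5].index.tolist())  # www.azchords.com here
```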
Where Do I Go From Here?
There are many ways to tweak this process. You could use only results for high-traffic terms, or only for long-tail terms. You could throw in a representative sample of both. I also like to measure how far each domain's total sits from the average, in standard deviations, and set a threshold (e.g. any site more than 2 SDs above the average is a competitor worth looking into).
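If you want to script that last step, here's a minimal sketch; the totals are placeholders standing in for the per-domain sums from your own pivot:

```python
import pandas as pd

# Per-domain totals from the earlier pivot; these values are placeholders.
totals = pd.Series({
    "www.ultimate-guitar.com": 96.0,
    "www.guitaretab.com": 18.0,
    "www.azchords.com": 10.0,
    "www.chordie.com": 9.0,
    "www.911tabs.com": 8.0,
    "www.e-chords.com": 7.0,
    "www.songsterr.com": 6.0,
    "www.guitartabs.cc": 5.0,
    "www.tabs4acoustic.com": 4.0,
    "www.bigbasstabs.com": 3.0,
})

# Flag anything more than two standard deviations above the mean total.
cutoff = totals.mean() + 2 * totals.std()
print(totals[totals > cutoff])  # with these numbers, only www.ultimate-guitar.com
```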
I’m sure there will be criticisms of this method based on its disregard for actual search traffic numbers. I know that a more rigorous solution would seek out such figures, or other ways of quantifying how important individual search terms are. In fact, traffic aside, click-through rates can vary widely based on other factors: stacked results, blended results with maps, videos, and images, and so forth. There is a trade-off to be made, though, between the time this process takes and the power of the data it provides.
When I’m looking for a quick overview of the competitive landscape, I want my method to work fast so I can start digging into competitors’ practices and backlink profiles. Does it really matter exactly where my competitors stand against each other? Isn’t it enough to be able to find the top five or ten competitors quickly? I recommend finding the balance of detail and time that you are comfortable with. A solid ROI, if you will.