Friday, April 1, 2011

SEOmoz Daily SEO Blog

Introducing FutureRank BETA - A Rank Predictor With Surprising Accuracy

Posted: 01 Apr 2011 02:18 AM PDT

Posted by adamf

I'm pleased to announce that SEOmoz's latest tool, FutureRank, is now available in beta and is free to anyone for the next 48 hours (afterward it'll be PRO only). FutureRank attempts to predict what might seem impossible: how your pages will rank next week, next month, and next year. Before you dive in and try this exciting tool yourself, it's important that you understand how it works, and its limitations.

You start by entering a keyword, the URL of a page that currently ranks for that keyword, and the approximate current rank. (Why approximate? Because most of the prediction is based on the data in our existing data store; what you enter is used as a starting point for our prediction models.)

[Screenshot: the FutureRank tool]

After you click the button, it takes about 30 seconds to run the prediction model (we'll try to keep you entertained while you wait). Once the calculations are complete, you'll be presented with the predicted ranks for next week, next month, and next year, along with the corresponding level of confidence.

[Screenshot: FutureRank prediction results]

I know you're wondering how we could be so confident with predictions so far in the future. Well, it depends on one big assumption that's important to understand before using the tool.

The accuracy depends on one big assumption

The FutureRank tool runs with one big assumption: that the SEO activities you've been performing for a given keyword and page will remain stable over the next 12 months. What does this mean? To use the Zappos screenshots above as an example: let's pretend Zappos has been spending about 10 hours each month trying to rank for the keyword "yellow shoes" (not likely, but let's pretend). The tool assumes that Zappos will continue that same level of effort of 10 hours a month for the next week, month, and year. For the tool to be accurate, your level of effort toward optimization for a given keyword/page must remain constant. It doesn't matter if that's no effort put toward optimization or 40 hours a week; it just needs to remain constant.

The accuracy of the model also depends on the keyword, because we have a varying amount of data for each keyword that's entered. Despite this, we were shocked that keywords our system has never seen yielded surprisingly high accuracy: our brilliant engineers have told us they were able to approximate changes to a given SERP by analyzing root words in a given keyword phrase. The tool is less accurate, however, with one-word keywords that our system has never seen, and the confidence scores in the tool will reflect this.

We've been testing the tool internally for the past few months and have been quite surprised by the accuracy. In the cases where the tool incorrectly predicted the rank, the culprit was often a keyword that we had been ignoring and then began optimizing soon after making the first prediction. After a month of optimizing, we found that the tool adjusted for this change and became more accurate when we ran the prediction again.

Free access for the next 48 hours, limited to PRO afterwards

We'd like everybody to have an opportunity to try the tool and provide feedback. After 48 hours, the tool will only be accessible to SEOmoz PRO members. Please give it a try, and let us know what you think using the Feedback link on the left side of the screen.

Try Out FutureRank

How does FutureRank work?

It may seem like we're using a time machine to make these predictions, and our design team had a little fun with that idea. While I'd love to say that SEOmoz has harnessed the power of space and time, it's actually not as complicated as you might think. It just requires a lot of data; no flux capacitor required.

Between crawling the web to create our Linkscape index and monitoring aggregate performance data for tens of thousands of websites, SEOmoz collects a large set of data on the web's link graph, ranks, traffic, and the composition of a wide range of SERPs. During a brainstorm session on how we could use this valuable data to create new tools for our PRO members, Cyrus from the customer team jokingly suggested we create a tool to predict the future rank of a given web page.

While most of us chuckled at the idea, a few of our engineers began looking at our data and creating some simple prediction models. Within a few weeks they had developed an internal alpha tool that was moderately accurate, and after a few months of tuning the prediction model against real results, we thought it was time to release the beta to the Moz community.

In short, our prediction model is based on analyzing the prior ranks of both your page and the other pages in the SERP, the Moz metrics over time (Domain Authority, Page Authority, MozRank, MozTrust), and machine learning models of the search engines' ranking factors. In the next few weeks we'll publish a more detailed post explaining the prediction model.
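
To make that general technique a little more concrete, here is a minimal sketch: fit a simple regression on a page's historical ranks and metrics, then extrapolate forward under the constant-effort assumption. Everything in it is an illustrative assumption; the toy data, the feature set, and the choice of a plain linear model are not the actual FutureRank model.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical weekly observations for one keyword/page pair:
    # [weeks_ago, page_authority, domain_authority, mozrank, moztrust]
    history = np.array([
        [4, 52.0, 61.0, 4.9, 5.1],
        [3, 53.0, 61.0, 5.0, 5.2],
        [2, 54.5, 62.0, 5.1, 5.2],
        [1, 55.0, 62.0, 5.2, 5.3],
    ])
    ranks = np.array([9.0, 8.0, 8.0, 7.0])  # observed rank each week

    model = LinearRegression().fit(history, ranks)

    # Predict next week's rank, assuming the metric trend continues
    # (the "constant effort" assumption described above).
    next_week = np.array([[0, 56.0, 62.5, 5.3, 5.3]])
    print(round(model.predict(next_week)[0]))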

Beta limitations

  • The tool currently requires you to enter the approximate current rank of the page (in a future revision we'll do this automatically).
  • Only works for Google US search at the moment (we hope to add other locales soon).
  • As mentioned previously, the predictions are only accurate if you continue SEO activities at the same level of effort (regardless of if that's low, medium, or high).
  • The accuracy varies by which keyword you choose to analyze: the more common the keyword, the more accurate the prediction, but the tool works surprisingly well with long-tail keywords too.

Does rank matter anymore?

Many have been discussing the merits of monitoring rank given how much it varies by searcher, geolocation, and the influence of social signals. Given this variance among users, the more valuable performance indicators might be traffic or an average rank among a wide range of searchers/locations. However, until a robust method exists to measure average rank across all your keywords, we believe rank is still a worthwhile performance indicator (so long as you're also measuring the traffic you receive from said keywords).
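
For what it's worth, one simple version of such an average-rank measure would weight each keyword's rank by the traffic it drives, so high-traffic keywords dominate the average. Here is a minimal sketch; the keywords, ranks, and visit counts are made up for illustration.

    # Traffic-weighted average rank across a hypothetical keyword portfolio.
    keywords = [
        {"keyword": "yellow shoes", "rank": 3, "visits": 1200},
        {"keyword": "red sneakers", "rank": 7, "visits": 300},
        {"keyword": "blue sandals", "rank": 12, "visits": 50},
    ]

    total_visits = sum(k["visits"] for k in keywords)
    weighted = sum(k["rank"] * k["visits"] for k in keywords) / total_visits
    print(f"Traffic-weighted average rank: {weighted:.1f}")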

We'd love your feedback

Please try the tool out and let us know what you think in the comments below!

Try Out FutureRank



Which Link Metrics Should I Use? Part 2 of 2 - Whiteboard Friday

Posted: 31 Mar 2011 02:19 PM PDT

Posted by Aaron Wheeler

We all know that, at first, it can be really difficult to decide what the most valuable link metrics are and when to use them. Last week, Rand outlined and defined a variety of metrics that are used to assess the respective values of domains, pages, and links between them. This week, he's back with the stunning conclusion: how to actually use these link metrics in your research and how to choose which metrics to use for given research situations. If you were ever confused about when you should be using PageRank and when you should be using mozRank, fret no longer!

 

Video Transcription

Howdy, SEOmoz fans! Welcome to another edition of Whiteboard Friday. Today the exciting conclusion, Part 2 of 2, on which link metrics to use. So, last week we discussed in depth a ton of the link metrics that are available, what they mean, what they do, how you can interpret them. Today I want to walk through some of the specific tasks and projects that you are going to be involved in when you are doing SEO kinds of things and which metrics can help you to accomplish those specific tasks.

First up, let's say I am doing a high level SERPs analysis, something like the keyword difficulty tool output report where it is trying to give me a sense of who is in the top 10, who is in the top 20. Why are they ranking there? Is it because they are an exact match domain? Do they have a lot of good anchor text? Do they have a ton of links into them? Is it because their domain is important or their page is important? We can look at a few key metrics. I like looking at page authority, which is that aggregate of all the mozMetrics and domain authority and then maybe the number of linking roots and C-blocks just to give me a rough idea of kind of what I am dealing with. That high level SERPs analysis is great when I am doing like a keyword difficulty report trying to determine which keywords to go after, whether it is roughly possible for me to be ranking in those sectors.
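
As a rough illustration of that high-level analysis, here is a minimal sketch that blends those metrics across a hypothetical top-10 result set into a crude difficulty score. The weights and data are invented for the example; this is not the keyword difficulty tool's actual formula.

    import math

    # Hypothetical metrics for the current top-10 results (truncated to three).
    top10 = [
        {"pa": 62, "da": 80, "linking_roots": 1500},
        {"pa": 48, "da": 55, "linking_roots": 210},
        {"pa": 71, "da": 90, "linking_roots": 9400},
    ]

    def difficulty(results):
        avg_pa = sum(r["pa"] for r in results) / len(results)
        avg_da = sum(r["da"] for r in results) / len(results)
        # Log-scale link counts so one hugely linked page doesn't dominate.
        avg_links = sum(math.log10(1 + r["linking_roots"]) for r in results) / len(results)
        # Blend into a rough 0-100 score; the weights are arbitrary assumptions.
        return 0.4 * avg_pa + 0.4 * avg_da + 2.0 * avg_links

    print(f"Estimated difficulty: {difficulty(top10):.0f}/100")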

If I want to do some link quality analysis, so I am looking particularly at a link and trying to determine is this link helping me to rank? Is it potentially hurting me? If I am looking maybe at a client's website, say I was doing consulting or I am a new SEO in an in-house position and I am trying to analyze whether some links that were built previously are questionable or not, there are some really good ways to do that. One of my favorites is looking at PageRank versus mozRank and mozTrust.

Normally, what you should see is that PageRank and mozRank are pretty close. If PageRank is a 3 and mozRank is like a 4.5, it might be okay. It's a little on the border. If it's a 3 and a 3.5, oh, that's, you know, that's perfectly fine. That's normal. We should expect that. If, however, I am looking at like a 3 and a mozRank is like a 5.8, something is fishy, right? Clearly, I mean, Google probably knows about more links than SEOmoz does, and mozRank, boy, for it to be that high and PageRank to be that low, something might be up. Something might be going on where this site is selling links, Google has caught them, they are doing something manipulative. This could be a problem. Then I also like comparing mozTrust, because a lot of times you won't see PR scores, especially for a lot of new sites and pages. Google hasn't gotten the data there, or they have an outdated PR but that site has built a lot of links in the meantime. By the way, you do want to be careful of that too when you are comparing PR and MR. But mozRank and mozTrust, if I see like a 5.8 and a 7.2, this is probably a phenomenal link. If I see a 5.8 and a 2.2, that's really, that's a bad sign. That usually means that this site or this page has gotten a lot of links, but from a lot of very questionable sources. Otherwise, their mozTrust should be quite a bit higher.

So, those types of analyses along with looking at not just the number of links but the number of external versus internal links, if it's a lot of internal links, maybe that is boosting up the ranking, but it will be easier to overcome than a high number of external links and followed/no-followed. If it is a lot of no-followed links coming to the site, oh that is a different story than if all the links are followed.
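
To tie those heuristics together, here is a minimal sketch of the kind of check Rand describes: flag a page when mozRank runs far ahead of PageRank, or when mozTrust lags well behind mozRank. The thresholds are rough assumptions pulled from the examples above, not official SEOmoz guidance.

    def flag_link_quality(pagerank, mozrank, moztrust):
        """Return warnings for the discrepancy patterns described above."""
        warnings = []
        # PageRank and mozRank should track each other fairly closely.
        if pagerank is not None and mozrank - pagerank > 2.0:
            warnings.append("mozRank far exceeds PageRank: possible penalty or stale PR")
        # mozTrust well below mozRank suggests links from questionable sources.
        if mozrank - moztrust > 2.5:
            warnings.append("low mozTrust relative to mozRank: untrusted link sources")
        return warnings

    # The "fishy" example from the transcript: PR 3, mozRank 5.8, mozTrust 2.2.
    print(flag_link_quality(pagerank=3.0, mozrank=5.8, moztrust=2.2))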

Now, if I am looking at outreach and popularity, I am trying to say, how popular is this blog? How important is this news website? How big and popular on the Web do I think this forum is or Q&A site or community? Then, I want to be looking at some of those high level metrics, but I might want to dive sort of one step deeper and look at, yes, domain authority. I really care about domain metrics here, right? Not individual pages on those sites. So, I am looking at Domain mozRank and Domain mozTrust, which are the same thing as mozRank and mozTrust but on the domain wide level, and then I might care a lot about the linking roots and C-blocks, because that tells me a raw popularity count. How many people on the Web have referenced this guy compared to other people?

Now, if I am looking and trying to sort by the most helpful links to raise my ranking, say I am analyzing a set of 50 blogs and I want to decide, who am I going to guest blog for first? Who do I really think is going to be providing that value? Or I have the opportunity to sponsor or speak at a conference or contribute in some way, and I know that I can contribute the content or whatever I need to, to get those links. I really care a little bit less about the metrics and more about these big three questions. So, I would ask you, before you look at the metrics, to ask yourselves these three questions, particularly if you are doing that sort of detail-level analysis.

Number one, how well does that page or that site rank? If you search for a few keywords that are in the title tag of this particular page or the homepage of the site and it does not come up in the number one or number two positions, that might not be a good sign. If you search for four or five keywords that compose a phrase in the title and it is still not coming up, something is seriously wrong there. There might be some issue with that site in Google.

How relevant and useful is it? Is this site going to send actual traffic? Was the link editorially given? Is it a true citation that represents an endorsement from one site, one page to another? If that is not the case, you might be in trouble in the future. Even if Google hasn't caught it yet, Bing hasn't caught it yet, in the future, that might be causing problems. It is just not worth it. Go spend your time on other links that are editorial, sincere citations.

Do the sites and pages it links to rank well? This is a great way to analyze directories or link lists or those kinds of things and say, oh, this looks highly relevant. It is a pretty good site. If the pages that it is linking to don't rank well for their keywords, that's a bad sign. If a few of them don't, okay maybe, you know, everybody links to a few bad apples. But if a lot of them are not ranking well, something is going on there, right?

Next, I might look at some metrics like mozRank versus PageRank, as we did above, mozRank versus mozTrust, and the number of links and linking root domains, just to get a sense of these. But those three questions, more so than any metric, are going to really answer the question of how helpful this particular page or site will be in raising my rankings if I get a link from them. Next, second to last here, is sorting of links. So if I want to do a rough or raw sort, I have a bunch of links that I exported from Google, or that I exported from a tool that analyzed a bunch of pages and figured out whether they were useful. Maybe I used the – in SEOmoz Labs there is that great tool to help me find all the search queries that I could use to find potential links. I think it is the, what is that called? I think it is the Link Acquisition Assistant. So, the Link Acquisition Assistant might export a bunch of raw lists of pages, and if I want to do some just raw sorting to get a general sense of importance before I start asking these questions, PA/DA are really good for that and so is the number of linking roots. So inside the web app, you will see a lot of these. We tend to show at least those three metrics on most everything so you can do a rough sort.
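
Here is a minimal sketch of that raw sort: order a list of exported link prospects by Page Authority, Domain Authority, and linking root domains before doing the deeper three-question analysis. The field names, URLs, and data are hypothetical, not the Link Acquisition Assistant's actual export format.

    # Hypothetical export of link prospects with Moz metrics attached.
    prospects = [
        {"url": "http://example-blog.com/post", "pa": 45, "da": 60, "linking_roots": 320},
        {"url": "http://tiny-forum.net/thread", "pa": 22, "da": 30, "linking_roots": 15},
        {"url": "http://big-news.com/story", "pa": 58, "da": 85, "linking_roots": 4100},
    ]

    # Sort best-first on the three metrics recommended above.
    prospects.sort(key=lambda p: (p["pa"], p["da"], p["linking_roots"]), reverse=True)
    for p in prospects:
        print(p["url"], p["pa"], p["da"], p["linking_roots"])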

Finally, last but not least, if I am doing a deep SERPs analysis, where I really want to know why does this particular page, why does this particular site rank where it does? Why is this 3 and this 2 and this 4? I want every metric I can get my hands on. The reason is because when you analyze these things all together in Excel, you can see weak points, strong points. You can get a sense of what Google is using or Bing is using in that particular grouping or algorithmic result to try to determine who should rank higher and lower, and that will give you a great sense of what you might need to do to accomplish those rankings.
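
A spreadsheet works fine for that kind of deep dive, but the same idea is easy to sketch in code: line up every metric against the observed rank and see which ones move together. The data below is invented, and with only ten rows any correlation is a hint, not proof.

    import pandas as pd

    # Hypothetical metrics for the top 10 results of one SERP.
    serp = pd.DataFrame({
        "rank":             [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
        "page_authority":   [68, 71, 60, 55, 58, 49, 52, 44, 40, 38],
        "domain_authority": [90, 72, 80, 66, 61, 70, 55, 50, 48, 45],
        "linking_roots":    [4100, 2500, 3100, 900, 1200, 700, 400, 350, 150, 90],
        "followed_links":   [9000, 5200, 7700, 2100, 2900, 1500, 800, 700, 300, 150],
    })

    # Negative correlation with rank means "higher metric, better position"
    # (since rank 1 is best). Spearman handles the ordinal rank sensibly.
    print(serp.corr(method="spearman")["rank"].drop("rank"))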

All right everyone, I hope that this two part Whiteboard Friday extravaganza has been great for you. I look forward to the comments on the blog. Take care.

Video transcription by SpeechPad.com


