luni, 16 aprilie 2012

Mish's Global Economic Trend Analysis



Trends in Nonfarm Employment, Civilian Employment, Weekly Unemployment Claims

Posted: 16 Apr 2012 09:31 PM PDT

Initial unemployment claims have generally been trending lower, but in a very choppy manner. One way to smooth out the weekly claims reports is to use seasonal adjustments. Another way is to use 4-week moving averages.

However, both methods are subject to fluctuations around floating holidays such as Thanksgiving and Easter.

Adding a couple of extra weeks to the moving averages and comparing not-seasonally-adjusted numbers to the same six weeks in prior years helps even more. Here are a couple of charts to consider.
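
For readers who want to reproduce this kind of smoothing themselves, here is a minimal sketch using pandas. It assumes a hypothetical CSV export of not-seasonally-adjusted weekly claims; the file and column names are made up for illustration:

    # A minimal sketch, assuming a hypothetical CSV of not-seasonally-adjusted
    # weekly initial claims with columns "week_ending" and "initial_claims".
    import pandas as pd

    claims = pd.read_csv("claims_nsa.csv", parse_dates=["week_ending"])
    claims = claims.sort_values("week_ending").set_index("week_ending")

    # Smooth the choppy weekly series with a 6-week moving average.
    claims["ma6"] = claims["initial_claims"].rolling(window=6).mean()

    # Compare the latest 6-week window with the same calendar weeks in prior years.
    weeks = claims.index.isocalendar().week
    latest_week = int(weeks.iloc[-1])
    window = weeks.isin(range(latest_week - 5, latest_week + 1))  # crude near year boundaries
    print(claims[window].groupby(claims[window].index.year)["initial_claims"].mean())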

Unemployment Claims 6-Week Moving Average vs. Same 6 Weeks in Prior Years



Certainly the number of initial claims has fallen dramatically but claims are still above levels from 2004 through 2008.

Workers With a Job Covered by Unemployment Insurance



The above chart shows 6-week moving averages of employees with a job and with benefits, compared to the same 6 weeks in prior years. Self-employed are not eligible for unemployment benefits.

Reader Tim Wallace, who prepared the above charts at my request, writes ...
Hello Mish

Anyway, here is the last six weeks' filing average compared to prior years. Note that, despite the steep drop, only the known recession years show claims higher than the current number.

Also note the six-week moving average of covered employees shows we are at 2001 levels.

However, the economy did legitimately add 1,474,871 jobs in the past year, the first upward trend since 2008. Unfortunately, we cannot determine the quality of said jobs. Moreover, the economy would need to add 17 million more jobs just to get back to 2001 employment levels as an equal percentage of the workforce.
Here are a few more charts and statistics to consider.

Nonfarm Employment



In terms of actual nonfarm employment, this "recovery" has only brought us back to a level last seen in 2001. Note that the length of each recovery period has gotten longer and longer.

Here are some additional charts from my April 6 report Nonfarm Payroll +120,000, Unemployment Rate Fell .1 to 8.2%, Record 87,897,000 "Not in Labor Force"

Nonfarm Employment - Payroll Survey - Annual Look - Seasonally Adjusted



Actual employment is about where it was just prior to the 2001 recession.
 
Quick Notes About the Unemployment Rate

  • The US unemployment rate dropped 0.1 to 8.2%.
  • In the last year, the civilian population rose by 3,604,000, yet the labor force only rose by 1,315,000. Those not in the labor force rose by 2,289,000.
  • For the month, the Civilian Labor Force fell by 164,000.
  • Those "Not in Labor Force" increased by 310,000. If you are not in the labor force, you are not counted as unemployed.
  • The number "Not in Labor Force" is at a new record high of 87,897,000.
  • By the Household Survey, the number of people employed fell by 31,000 for the month.
  • By the Household Survey, the number of people employed rose by 2,270,000 over the course of the last year.
  • The Participation Rate fell 0.1 to 63.8%.
  • Were it not for people dropping out of the labor force, the unemployment rate would be well over 11%.

Over the past several years people have dropped out of the labor force at an astounding, almost unbelievable rate, holding the unemployment rate artificially low. Some of this was due to major revisions last month on account of the 2010 census finally being factored in. However, most of it is simply economic weakness.

Between January 2008 and February 2010, the U.S. economy lost 8.8 million jobs.

Since a recent employment low in February 2010, nonfarm payrolls have expanded by 3.6 million jobs. Of the 8.8 million jobs lost between January 2008 and February 2010, 41 percent have been recovered.

Statistically, roughly 125,000 jobs a month is enough to keep the unemployment rate flat. For a discussion, please see Question on Jobs: How Many Does It Take to Keep Up With Demographics?

The average employment gain over the last 25 months has been 143,000, barely enough (statistically speaking) to make a dent in the unemployment rate. Thus, the unemployment rate fell because millions dropped out of the labor force.
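
A quick back-of-the-envelope check of those figures (the inputs are simply the rounded numbers quoted in this post; the rounded 3.6 million gives roughly 144,000 a month, consistent with the 143,000 average cited):

    # Back-of-the-envelope check using the rounded figures quoted above.
    jobs_lost = 8_800_000        # January 2008 through February 2010
    jobs_recovered = 3_600_000   # since the February 2010 employment low
    months = 25                  # recovery months covered by the average above
    breakeven = 125_000          # rough monthly gain needed to keep the rate flat

    print(f"Share of lost jobs recovered: {jobs_recovered / jobs_lost:.0%}")    # ~41%
    print(f"Average monthly gain: {jobs_recovered / months:,.0f}")              # ~144,000
    print(f"Cushion above break-even: {jobs_recovered / months - breakeven:,.0f} jobs/month")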

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com
Click Here To Scroll Thru My Recent Post List


Spain Government May Take Over Some Regions' Finances; Spanish 10-Year Yield Hits 6.15%, CDS at Record High; Mission Impossible

Posted: 16 Apr 2012 12:24 PM PDT

Madrid Threatens Intervention as Regional Debt Worries Mount

The Financial Times reports Madrid threatens to intervene in regions
Madrid has threatened to seize budgetary control of wayward Spanish regions as early as May if they flout deficit limits, officials said – as investors took fright at the fragility of some eurozone economies.

Concerns about overspending by Spain's 17 autonomous regions and fears that its banks will need to be recapitalised with emergency European Union funds undermined confidence in the country's sovereign bonds, forcing down prices and pushing yields up above 6 per cent on Monday – towards levels considered unsustainable.

The cost of insuring the country's debt also rose, with Spanish credit default swaps jumping to a record 510 basis points, according to Markit, the data provider.

One possible candidate for intervention is Andalucia in the south, Spain's most populous region, which has attacked Mr Rajoy's austerity measures. Mr Rajoy's Popular party had hoped to win a regional election last month and oust the leftwingers who have run Andalucia for 30 years but the PP did not get enough votes and the left remains in control.

Andalucia, however, is not alone in failing to obey fiscal rules. All the major political parties, including the PP, have exceeded deficit targets in the regions they administer.
Spain Government May Take Over Some Regions' Finances

The Wall Street Journal has additional details in its report Spain Government May Take Over Some Regions' Finances
Spain's government Monday warned it could take control of finances in some of its autonomous regions to slash one of Europe's largest budget deficits and shore up investor confidence.

A top government official, who asked not to be named, told journalists there will soon be new tools to control regional spending. Parliament is expected to pass legislation by the end of this month allowing Madrid to force spending cuts, impose fines and take over financial management in regions breaching budget targets or falling into deep difficulties.

The official said Madrid may move to take over at least one of the country's cash-strapped regions this year, as lack of access to financial markets and plummeting tax revenue are undermining their capacity to fund themselves.

"The way things are going, the regions themselves will request the intervention," the official said. "There are regions with no access to funding, no way to pay bills. That's why we are going to have to intervene."

The official added Madrid should have more information by May on the state of regional finances, and on which might need to be taken over. The government has set up a new credit line to regions so they can pay off large debts to their suppliers. To access the facility they must present a plan to pay back the money and provide detailed information on their finances.

Separately, Spanish Education Minister Jose Ignacio Wert met regional education authorities to agree a series of measures, including larger class sizes and longer teachers' hours, to cut EUR3 billion from regional budgets. The government meets regional health officials Wednesday to find another EUR7 billion in cuts. These two social services account for the bulk of regional spending, and the regions have long complained they can't reduce it unless Madrid changes regulations governing the services they must provide.
Mission Impossible

Wolfgang Münchau, a Financial Times columnist, says Spain has accepted mission impossible
News coverage seems to suggest that the markets are panicking about the deficits themselves. I think this is wrong. The investors I know are worried that austerity may destroy the Spanish economy, and that it will drive Spain either out of the euro or into the arms of the European Stability Mechanism.

The orthodox view, held in Berlin, Brussels and in most national capitals (including, unfortunately, Madrid), is that you can never have too much austerity. Credibility is what matters. When you miss the target, you must overcompensate to hit it next time. The target is the goal – the only goal.

This view does not square with the experience of the eurozone crisis, notably in Greece. It does not square with what we know from economic theory, or from economic history. And it does not square with the simple though unscientific observation that the periodic episodes of market panic about Spain have always tended to follow an austerity announcement.

European policy makers have a tendency to treat fiscal policy as a simple accounting exercise, omitting any dynamic effects. The Spanish economist Luis Garicano made a calculation, as reported in El País, in which the reduction in the deficit from 8.5 per cent of GDP to 5.3 per cent would require not a €32bn deficit reduction programme (which is what a correction of 3.2 per cent would nominally imply for a country with a GDP of roughly €1tn), but one of between €53bn and €64bn. So to achieve a fiscal correction of 3.2 per cent, you must plan for one almost twice as large.

Spain's effort at deficit reduction is not just bad economics, it is physically impossible, so something else will have to give. Either Spain will miss the target, or the Spanish government will have to fire so many nurses and teachers that the result will be a political insurrection.
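
To make the arithmetic in Münchau's example explicit, here is a rough sketch. The scaling factors are simply the ratios implied by the quoted €53bn-€64bn range against the nominal €32bn figure, not an attempt to reproduce Garicano's actual model:

    # Rough arithmetic behind the figures quoted above.
    gdp = 1_000e9                        # roughly EUR 1tn
    deficit_now, deficit_target = 0.085, 0.053

    nominal = (deficit_now - deficit_target) * gdp
    print(f"Nominal correction: EUR {nominal / 1e9:.0f}bn")          # ~32bn

    for required in (53e9, 64e9):        # Garicano's dynamically adjusted range
        print(f"EUR {required / 1e9:.0f}bn programme -> {required / nominal:.2f}x the nominal cut")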
Market Screams Mission Impossible As Well

I too have been preaching the "Mission Impossible" idea for months if not years. More importantly, the market has once again latched on to that idea with credit default swaps on Spain's sovereign debt at record highs, and the yield on 10-year Spanish bonds back above 6% today, settling at 6.07% after reaching a high of 6.156% according to Bloomberg.

History suggests the more eurocrats resist a default for Spain, the bigger the resultant mess. Greece should be proof enough. However, eurocrat clowns have no common sense, no economic sense either, and they do not care about history. Expect a gigantic eurozone mess as efforts to kick the can do nothing but make matters worse.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com
Click Here To Scroll Thru My Recent Post List


Mish on Coast-to-Coast Radio Tonight 10:00PM PST Discussing the Global Economy

Posted: 16 Apr 2012 10:15 AM PDT

I will be on the Coast-to-Coast late night syndicated radio talk show with George Noory to discuss the state of the US and global economies, jobs, stimulus efforts by the Fed and ECB, housing, healthcare, the stock market, and gold.

Most major metropolitan areas of the country have a station that will pick up the broadcast. About three million listeners tune in nightly.

Here is a link to a map of Where to Listen to Coast to Coast in your area.

You can also pick up the broadcast on SIRIUS XM Satellite Radio - XM TALK 168.

Please tune in if you can. My scheduled time is at the beginning of the show but there may be a lead-in of general news for 5 minutes or so first.

I will see if Coast-to-Coast will agree to let me take some calls from listeners, and if so perhaps I will be on longer.

Update:

I will be on for most of the first hour, not 15 minutes as originally posted. There will be a couple of short general news items before I come on.

The show is rebroadcast several hours later, so listeners on the East Coast can also tune in at 1:00 AM or 5:00 AM.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com
Click Here To Scroll Thru My Recent Post List


How Authorship (and Google+) Will Change Linkbuilding


How Authorship (and Google+) Will Change Linkbuilding

Posted: 15 Apr 2012 02:18 PM PDT

Posted by Tom Anthony

Google's relationship with links has changed over the last 15 years - it started out as a love affair but nowadays the Facebook status would probably read: "It's Complicated". I think Google are beginning to suffer from trust issues, brought about by well over a decade of the SEO community manipulating the link graph. In this post I'm going to lay out how I think Authorship and Google+ are one of the ways that Google are trying to remedy this situation.

I'll move on to what that means for what we should be doing differently in the future, and I'm sharing a free link-building tool you can all try out to experiment with these ideas. The tool will allow you to see who is linking to you rather than just where the links come from, and will provide you with social profiles for these authors, as well as details of where else they write.

To start I want to quickly look at a brief history of Google's view of links.

Are links less important than they were?

Back in the early days Google treated all links as being equal. A link in the footer was as good as a link in the main content, a link in bad content was as good as a link in good content, and so on. However, then the new generation of SEOs arrived and started 'optimizing' for links. The black hats created all sorts of problems, but the white hats were also manipulating the link graph. What this meant was that Google now had to begin scrutinizing links to decide how trustworthy they were.

Every link would be examined for various accompanying signals, and it would be weighted according to these signals. It was no longer a case of all links being equal. Reciprocal links began to have a diminished effect, links in footers were not as powerful, and so it went for a variety of other signals. Over the last decade Google have adopted a wide range of new signals to answer the question they face for every single link: how much do we trust this link?

They've also introduced an increasing number of signals for evaluating pages beyond the link-based signals they started with. If we look at the ranking factors survey results from SEOmoz for 2011, we see that link-based factors make up just over 40% of the algorithm. However, in the 2009 survey they were closer to 55% of the algorithm.

So in the last two years, roughly 15 percentage points of relative importance have shifted from links to other signals. The results are from a survey, but a survey of people who live and breathe this stuff, and it seems to match up well with what the community as a whole believes, and with what we observe in the increasing importance of social signals and the like.

This reduction in the relative power of links seems to imply that Google aren't able to trust links as much as they once did. Whilst links are clearly still the backbone of the algorithm, it is equally clear Google has been constantly searching for other factors to offset the 'over-optimization' that links have suffered from.

Are social signals the answer?

The SEO community has been talking a lot about social signals the last couple of years, and whether they are going to replace links. I'd argue that social signals can tell you a lot about trust, timeliness, perhaps authority and other factors, but that they are quite limited in terms of relevancy. Google still need the links - they aren't going anywhere anytime soon.

To visualise this point in a different way, let's look at a toy example of the Web Graph, where the nodes represent websites (or webpages) and the connections between them are the links between those websites:

Illustration of the Web Graph

And a corresponding toy example of the Social Graph:

Illustration of the Social Graph

We can now visualise Social 'Votes' (be they likes/tweets/+1s/pins or shares of some other type) for different websites. We can see that nodes on the Social Graph send their votes to nodes on the Web Graph:

Illustration of Social Votes

The Social Graph is sending signals over to the websites. They are basically saying 'Craig likes this site', or 'Rand shared this page'. In other words, the social votes are signals about web sites/pages and not about the links -- they don't operate on the graph in the same manner as links.

Whilst social signals do give Google an absolute wealth of information, they don't directly help improve the situation with links and how some links are more trustworthy than others.
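
To make that distinction concrete, here is a toy sketch (the pages and voters are invented; 'Craig' and 'Rand' are just the examples used above):

    # Toy illustration: links are edges between pages, social votes attach to pages.
    web_graph = {                              # page -> pages it links to
        "distilled.net/post": ["seomoz.org/blog", "example.com"],
        "seomoz.org/blog":    ["example.com"],
    }
    social_votes = [                           # person -> page they liked/shared/+1'd
        ("Craig", "seomoz.org/blog"),
        ("Rand",  "distilled.net/post"),
    ]

    # A vote raises a page's standing...
    votes_per_page = {}
    for person, page in social_votes:
        votes_per_page[page] = votes_per_page.get(page, 0) + 1
    print(votes_per_page)

    # ...but it says nothing about any individual edge in web_graph, which is why
    # social signals alone can't tell Google how much to trust a particular link.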

Putting the trust back into links

So Google have needed to find a way to let people improve the quality of a link - to verify that links are trustworthy. I believe that verifying the author of a link is a fantastic way to achieve this, and it fits neatly into the model.

In June last year Google introduced rel=author, the method that allows a web page to announce the author of the page by pointing to a Google+ profile page (which has to link back to the site for two-way verification).

We're seeing the graphs merge into a new Web Graph augmented by author data, where some links are explicitly authored links:

WebGraph showing Authored Links

With this model it isn't: 'Distilled linked to SEOmoz' but it is 'Tom Anthony linked on Distilled to Rand Fishkin on SEOmoz'. It's the first time there has been a robust mechanism for this.

This is incredibly powerful for Google as it allows them to do exactly what I mentioned above - they can now verify the author of a web page. This gives 2 advantages:

  • Knowing this is an authored link, by a human who they have data about, they can place far more trust in the link. It's likely that a link authored manually by a human is of higher quality, and that a human is unlikely to claim responsibility for a link if it is spammy.
  • Furthermore it allows them to change the weighting of links according to the AuthorRank of the author who placed the link.

The latter point is very important: it could change how links pass link juice. I believe this will shift the link juice model towards:

AuthorRank x PageRank = AuthoredPageRank

I've shown it here as a simple multiplication (and without all the other factors I imagine go into this), but it highlights the main principle: authors with a higher AuthorRank (as determined, I'd imagine, by both their social standing and by the links coming into their authored pages) give a bigger boost to the links they author:

Examples of Authored PageRank

The base strength of the link still comes from the website, but Rand is a verified author who Google know a lot about, and as he has a strong online presence, he multiplies the power of links that he authors.

I'm a less well-known author, so I don't give as much of a boost to my links as Rand would give. However, I still give links a boost over anonymous authors, because Google now trust me a bit more. They know where else I write, that I'm active in the niche and socially, and so on.
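
A minimal sketch of that weighting, with made-up numbers (nobody outside Google knows the real formula or values):

    # Illustrative only: weight a link's base strength by the author's AuthorRank.
    def authored_page_rank(page_rank, author_rank=1.0):
        # An anonymous or unverified link keeps author_rank = 1.0 (no boost).
        return page_rank * author_rank

    base = 5.0                                   # strength from the linking page
    print(authored_page_rank(base))              # anonymous author: 5.0
    print(authored_page_rank(base, 1.2))         # lesser-known verified author
    print(authored_page_rank(base, 2.0))         # well-known verified author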

Where to Who

So what does all this imply that you do? The obvious things are ensuring that you (and your clients) are using authorship markup, and of course you should try to become trustable in the eyes of Google. However, if you're interested in doing that stuff, you probably were already doing it.

The big thing is that we need a shift in our mindset from where we are getting links from to who we are getting links from. We still need to do the traditional stuff, sure, but we need to start thinking about 'who' more and more. Of course, we do that some of the time already. Distilled noticed when Seth Godin linked to our Linkbait Guide. I noticed when Bruce Schneier linked to me recently, but we need to begin doing this all in a scalable fashion.

OpenSiteExplorer, Majestic and many other link-building tools already allow us to look at where we are getting links from in a scalable way.

I hope I've managed to convince you that we need to begin examining this from the perspective that Google increasingly will be taking. We need tools for looking at who is linking to who. Here's the thing - all the information we need for this is out there. Let me show you...

Authored links - A data goldmine

We'll examine an example post, "Dear Google", that Gianluca Fiorelli published in December. Gianluca is using Google's authorship markup to highlight that he is the author of the post.

Let's take a look at what information we can pull out from this markup.

The rel=author attribute in the HTML source of the page points to his Google+ page, and from there we can establish a lot of details about Gianluca:

Authorship Markup leads to Google+ profile info

 

From his Google+ profile we can establish where Gianluca lives, his bio, where he works, and so on. We can also get an indicator of his social popularity from the number of Circles he is in, and by examining the other social profiles he links to (for example, following the link to his Twitter profile and seeing how many Twitter followers he has).
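
The first step of that crawl is easy to sketch. Here is a rough example using requests and BeautifulSoup; the page URL is hypothetical, and parsing Circles or the 'Contributes to' field off the Google+ profile itself is left out because that page structure isn't documented here:

    # Find the rel="author" link in a page's HTML and return the Google+ profile URL.
    import requests
    from bs4 import BeautifulSoup

    def find_author_profile(page_url):
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        # rel=author can appear on either <a> or <link> elements.
        tag = soup.select_one('a[rel~="author"], link[rel~="author"]')
        return tag.get("href") if tag else None

    print(find_author_profile("http://example.com/dear-google"))   # hypothetical URL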

We've talked a lot in the industry in the last couple of years about identifying influencers in a niche, and about building relationships with people. Yet, there is an absolute abundance of information available about authors of links we or our competitors already have -- why are we not using it!?!

All of this data can be crawled and gathered automatically, exactly in the way that Google crawls the authorship markup, which allows us to begin thinking about building the scalable sorts of tools I have mentioned. In the absence of any tools, I went right ahead and built one...

AuthorCrawler - A tool for mining Author Data for Linkbuilding

I first unveiled this tool a couple of weeks ago at LinkLove London, but I'm pleased to release it publicly today. (As an aside, if you like getting exclusive access to cool toys like this then you should check out SearchLove San Fran in June or MozCon in July).

AuthorCrawler is a free, open-source tool that pulls the backlinks to a URL, crawls the authorship markup on the page, and gives you a report of who is linking to a URL. It is fully functional, but it is a proof-of-concept tool, and isn't intended to be an extensive or robust solution. However, it does allow us to get started experimenting with this sort of data in a scalable way.

When you run the report, you'll get something similar to this example report (or take a look at the interactive version) I ran for SEOmoz.org:

AuthorCrawler Single URL report

It pulls the top 1,000 backlinks for the homepage, then crawls each of them looking for authorship markup. When markup is found, it is followed to crawl the author's data (number of Circles, Twitter followers), and, very importantly, it also pulls the 'Contributes to' field from Google+ so you can see where else this author writes. You might find people linking to your site who also write elsewhere, maybe on more powerful sites, so these are great people to build a relationship with - they are already aware of you, warm to you (they're already linking) and could provide links from other domains.

You can sort the report by the PA/DA of where the link was placed, or by the social follower counts of the authors. You can also click through to the authors' Google+ and Twitter profiles to quickly see what they're currently up to.

I'm pretty excited by this sort of report and I think it opens up some creative ideas for new approaches to building both links and relationships. However, I still felt we could take this a little bit further.

I'm sure many of you will know the link intersect tool in the labs section of SEOmoz. It allows you to enter your URL, and the URLs of other domains in your niche (most likely your competitors, but not necessarily), and it examines the backlinks to each of these and reports on domains/pages that are linking to multiple domains in your niche. It also reports whether you currently have a link from that page - so you can quickly identify some possible places to target for links. It's a great tool!

So, I took the principle from the link intersect tool and I applied the authorship crawling code to create an Author Intersect tool. It will give you a report that looks like this (you can check the interactive example report also):

Multi URL report from AuthorCrawler tool

Now what you have is really cool - you have a list of people who are writing about your niche, who are possibly linking to your competitors, whose social presence you can also see at a glance. These are great people to reach out to build relationships with - they are primed to link to you!
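
Under the hood this is just a set intersection over authors instead of domains. A minimal sketch, assuming you have already crawled an author profile URL for each backlink of each site (the data below is invented):

    # The Author Intersect idea: intersect authors instead of domains.
    from collections import Counter

    authors_linking_to = {                         # hypothetical crawl output
        "yoursite.com":     {"plus.google.com/111", "plus.google.com/222"},
        "competitor-a.com": {"plus.google.com/222", "plus.google.com/333"},
        "competitor-b.com": {"plus.google.com/333", "plus.google.com/444"},
    }

    your_site = "yoursite.com"
    competitors = [d for d in authors_linking_to if d != your_site]

    # Authors writing about the niche (linking to 2+ competitors) who are not
    # yet linking to you -- prime outreach candidates.
    counts = Counter(a for d in competitors for a in authors_linking_to[d])
    prospects = {a for a, n in counts.items() if n >= 2} - authors_linking_to[your_site]
    print(prospects)   # {'plus.google.com/333'}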

The tool is pretty simple to use - if you're unsure there is an instructions page on the site to get you started.

Wrap Up

We are in the early days of authorship, but I think Google are going to keep on pushing Google+ hard, and I think authorship's importance is just going to increase. Correspondingly, I think tools such as this are going to become an increasingly important part of an SEO's toolkit in the next 12 months, and I'm excited to see where it goes.

I've only just begun to dig into the ways we can use tools like these - so I'd love to hear from others what they get up to with it. So go and download the tool and try it out. Have fun! :)



How to Check Which Links Can Harm Your Site's Rankings

Posted: 15 Apr 2012 03:35 AM PDT

Posted by Modesto Siotos

Matt Cutts' statement in March 2012 that Google would be rolling out an update against "overoptimised" websites caused great turmoil within the SEO community. A few days later thousands of blogs were removed from Google's index, and Matt tweeted confirming that Google had started taking action against blog networks.

Even though most of the thousands of blogs manually removed from Google's index were of low or average authority, they weren't the only victims. For instance, www.rachaelwestdesigns.com, a PR7, DA70 domain, was also removed, probably due to its very high number of blogroll (site-wide) backlinks.

These actions indicate that the new update on "overoptimised" websites has already begun to roll out but it is uncertain how much of it we have seen so far.

At around the same time, Google sent thousands of webmasters the following message via Google's Webmaster Tools:

In the above statement, it is unclear what Google's further actions will be. In any case, working out the number of "artificial" or "unnatural" links with precision is a laborious, almost impossible task. Some low-quality links may not be reported by third-party link data providers, or, even worse, because Google has started deindexing several low-quality domains, the task can end up being a real nightmare, as several domains cannot be found even in Google's index.

Nevertheless, there are some actions that can help SEOs assess the backlink profile of any website. Because, in theory, any significant number of low-quality links could hurt, it makes sense to gather as much data as possible and not just examine the most recent backlinks. Several thousand domains have already been removed from Google's index, resulting in millions of links being completely devalued, according to Distilled's Tom Anthony (Linklove 2012).

Therefore, the impact on the SERPs has already been significant, and, as always happens on these occasions, there will be new winners and losers once the dust settles. However, at this stage it is a bit early to draw any conclusions because it is unclear what Google's next actions are going to be. Nevertheless, getting ready for those changes makes perfect sense, and spotting them as soon as they occur would allow for quicker decision-making and immediate action, as far as link building strategies are concerned.

As Pedro Dias, an ex-Googler from the search quality/web spam team, tweeted, "Link building, the way we know it, is not going to last until the end of the year" (translated from Portuguese).

The Right Time For a Backlinks Risk Assessment

Carrying out a backlink audit to identify the percentage of low-quality backlinks would be a good starting point. A manual, thorough assessment is only realistic for relatively small websites, where it is easy to gather and analyse the backlink data – for bigger sites with thousands of backlinks it would be pointless. The following process expands on Richard Baxter's solution in 'How to check for low quality links', and I hope it makes it more complete.

  1. Identify as many linking root domains as possible using various backlink data sources.
  2. Check the ToolBar PageRank (TBPR) for all linking root domains and pay attention to the TBPR distribution.
  3. Work out the percentage of linking root domains that have been deindexed.
  4. Check the social metrics distribution (optional).
  5. Repeat steps 2, 3 and 4 periodically (e.g. weekly, monthly) and check for the following (a minimal scripted sketch of steps 2-4 appears after this list):
  • A spike towards the low end of the TBPR distribution
  • An increasing number of deindexed linking root domains on a weekly/monthly basis
  • Social metrics numbers that remain unchanged at very low levels
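
Here is the minimal scripted sketch referred to in step 5, assuming a CSV exported from NetPeak Checker (or a similar tool) with the hypothetical columns "domain", "tbpr", "google_indexed", "fb_likes", "tweets" and "plus_ones":

    # Minimal sketch of steps 2-4, run against a hypothetical NetPeak-style export.
    import pandas as pd

    domains = pd.read_csv("linking_root_domains.csv")

    # Step 2: ToolBar PageRank distribution -- watch for a spike at 0 / n/a.
    print(domains["tbpr"].fillna("n/a").value_counts())

    # Step 3: share of linking root domains with no pages left in Google's index.
    deindexed = domains["google_indexed"].fillna(0) == 0
    print(f"Deindexed linking root domains: {deindexed.mean():.1%}")

    # Step 4 (optional): domains with essentially no social mentions.
    social = domains[["fb_likes", "tweets", "plus_ones"]].fillna(0).sum(axis=1)
    print(f"Domains with zero social mentions: {(social == 0).mean():.1%}")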

A Few Caveats

The above process does come with some caveats, but on the whole it should provide some insight and help you make a backlink risk assessment in order to work out a short- and long-term action plan. Even though the results may not be 100% accurate, it should be fairly straightforward to spot negative trends over a period of time.

Data from backlink intelligence services has flaws. No matter where you get your data from (e.g. Majestic SEO, Open Site Explorer, Ahrefs, Blekko, Sistrix), there is no way to get the same depth of data Google has. Third-party tools are often not up to date, and in some cases the linking root domains are not even linking back anymore. Therefore, it makes sense to filter all identified linking root domains and keep only those still linking to your website. At iCrossing we use a proprietary tool, but there are commercial link check services available in the market (e.g. Buzzstream, Raven Tools).

ToolBar PageRank gets updated infrequently (roughly 4-5 times a year), so in most cases the returned TBPR values represent the TBPR the linking root domain gained in the last TBPR update. It would therefore be wise to check when TBPR was last updated before drawing any conclusions; carrying out the above process straight after a TBPR update would probably give more accurate results. However, in some cases Google may instantly drop a site's TBPR in order to make public that the site violates their quality guidelines and to discourage advertisers. Therefore, low TBPR values such as n/a (greyed out) or 0 can in many cases flag up low-quality linking root domains.

Deindexation may be natural. Even though Google these days is deindexing thousands of low quality blogs, coming across a website with no indexed pages in Google's SERPs doesn’t necessarily mean that it has been penalised. It may be an expired domain that no longer exists, an accidental deindexation (e.g. a meta robots noindex on every page of the site), or some other technical glitch. However, deindexed domains that still have a positive TBPR value could flag websites that Google has recently removed from its index due to guidelines violations (e.g. link exchanges, PageRank manipulation).

Required Tools

For large data sets, NetPeak Checker performs faster than SEO Tools, where large data sets can make Excel freeze for a while. NetPeak Checker is a free standalone application which provides very useful information for a given list of URLs, such as domain PageRank, page PageRank, Majestic SEO data, OSE data (PA, DA, mozRank, mozTrust, etc.), server responses (e.g. 404, 200, 301), number of indexed pages in Google and a lot more. All results can then be exported and processed further in Excel.

1. Collect linking root domains

Identifying as many linking root domains as possible is fundamental, and relying on just one data provider isn't ideal. Combining data from Webmaster Tools, Majestic SEO and Open Site Explorer may be enough, but the more data the better, especially if the examined domain has been around for a long time and has received a large number of backlinks over time. Multiple backlinks from the same linking root domain should be collapsed so we end up with a list of unique linking root domains. Linking root domains returning not found (404) should also be removed.
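
A small sketch of this consolidation step; the export file names and the "source_url" column are hypothetical, and the root-domain extraction below ignores multi-part public suffixes such as .co.uk:

    # Merge backlink exports from several providers into unique linking root domains.
    from urllib.parse import urlparse
    import csv

    def root_domain(url):
        host = urlparse(url if "//" in url else "http://" + url).netloc.lower()
        return host[4:] if host.startswith("www.") else host

    linking_roots = set()
    for export in ("gwt_links.csv", "majestic.csv", "ose.csv"):   # hypothetical files
        with open(export, newline="") as fh:
            for row in csv.DictReader(fh):
                linking_roots.add(root_domain(row["source_url"]))

    print(f"{len(linking_roots)} unique linking root domains")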

2. Check PageRank distribution

Once a good number of unique linking root domains has been identified, the next step is scraping the ToolBar PageRank for each one of them. Ideally, this step should be applied only to those root domains that are still linking to our website; the ones that aren't should be discarded if that isn't too complicated. Then, using a pivot chart in Excel, we can decide whether the current PageRank distribution should be a concern or not. A spike towards the lower-end values (such as 0 and n/a) should be treated as a rather negative indication, as in the graph below.

3. Check for deindexed root domains

Working out the percentage of linking root domains which are not indexed is essential. If deindexed linking root domains still have a positive TBPR value, most likely they have been recently deindexed by Google.

4. Check social metrics distribution (optional)

Adding the social metrics (e.g. Facebook Likes, Tweets and +1s) of all identified linking root domains into the mix may be useful in some cases. The basic idea here is that low-quality websites would have a very low number of social mentions as users wouldn't find them useful. Linking root domains with low or no social mentions at all could possibly point towards low-quality domains.

5. Check periodically

Repeating steps 2, 3 and 4 on a weekly or monthly basis could help identify whether there is a negative trend due to an increasing number of linking root domains being removed. If both the PageRank distribution and the deindexation rate are deteriorating, sooner or later the website will experience ranking drops that will result in traffic loss. A weekly deindexation rate graph like the following could give an indication of the degree of link equity loss:
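
A sketch of how the periodic snapshots could be turned into that trend line, assuming each weekly run is saved as a CSV with the hypothetical "google_indexed" column used earlier:

    # Track the deindexation rate across periodic snapshots of the same domain list.
    import glob
    import pandas as pd

    rates = {}
    for path in sorted(glob.glob("snapshots/linking_domains_*.csv")):  # dated filenames
        snap = pd.read_csv(path)
        rates[path] = (snap["google_indexed"].fillna(0) == 0).mean()

    trend = pd.Series(rates)
    print(trend)   # weekly deindexation rate
    print("Deteriorating" if trend.is_monotonic_increasing else "No clear trend")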

Note: For more details on how to set-up NetPeak and apply the above process using Excel please refer to my post on Connect.icrossing.co.uk.

Remedies & Actions

So far, several websites have seen ranking drops as a result of some of their linking root domains being removed from Google's index. Those with very low PageRank values and low social shares over a period of time should be manually/editorially reviewed in order to assess their quality. Such links are likely to be devalued sooner or later, so a new link building strategy should be devised. Working towards a more balanced PageRank distribution should be the main objective; links from low-quality websites will keep turning up naturally to some extent.

In general, the more authoritative and trusted a website is, the more low-quality linking root domains can be linking to it without causing any issues. Big brands' websites are less likely to be impacted because they are more trusted domains. That means that low authority/trust websites are more at risk, especially if most of their backlinks come from low-quality domains, they have a high number of site-wide links, or their backlink profile shows an unnatural anchor text distribution.

Therefore, if any of the above issues have been identified, increasing the website's trust, reducing the number of unnatural site-wide links and making the anchor text distribution look more natural should be the primary remedies.

About the author

Modesto Siotos (@macmodi) works as a Senior Natural Search Analyst for iCrossing UK, where he focuses on technical SEO issues, link tactics and content strategy. Modesto is happy to share his experiences with others and posts regularly on Connect, a UK digital marketing blog.



Nominate a Hero for the 2012 Citizens Medal

The White House: Your Daily Snapshot for Monday, April 16, 2012
 

Nominate a Hero for the 2012 Citizens Medal

The Citizens Medal is one of our nation's highest civilian honors. For the past two years, President Obama has relied on you to identify heroes in your community – extraordinary Americans whose work provides inspiration for others to serve.

Nominations for the 2012 Presidential Citizens Medal are now open, and you can nominate a hero in your community until Tuesday, April 24th:

Nominate a hero for the 2012 Citizens Medal

In Case You Missed It

Here are some of the top stories from the White House blog:

President Obama at the Summit of the Americas
In his opening remarks at the Summit of the Americas, President Obama laid out the issues for discussion -- including trade.

President Obama Wants You to Know How Your Tax Dollars Are Spent
The Federal Taxpayer Receipt shows how your tax dollars are being spent.

Weekly Address: It’s Time for Congress to Pass the Buffett Rule
President Obama urges Congress to pass the Buffett Rule -- which asks those who make more than $1 million a year to pay at least the same percentage of their income in taxes as middle class families -- as a principle of fairness.

Today's Schedule

All times are Eastern Daylight Time (EDT).

9:00 AM: White House Summit on Environmental Education Part 1 WhiteHouse.gov/live

1:30 PM: White House Summit on Environmental Education Part 2 WhiteHouse.gov/live

"WhiteHouse.gov/live" indicates that the event will be live-streamed on WhiteHouse.gov/live.

