Monday, February 6, 2012

Mish's Global Economic Trend Analysis



Fewer Nonfarm Employees Now Than December 2000; Unemployment Rate: Some Things Still Don't Add Up; Obamanomics?

Posted: 06 Feb 2012 01:57 PM PST

Along with many others, I am pondering the latest employment numbers. Strong opinions are the norm.

Many are steadfast in their interpretations, and some are downright combative, especially Bondad, who blasted Zero Hedge in a scathing attack: "No Rick Santelli and Zero Hedge, One Million People Did Not Drop Out of the Labor Force Last Month"

Does Bondad Have a Point?

The short answer is yes. I wonder if I escaped attack because of a statement in my post Nonfarm Payroll +243,000; Unemployment Rate 8.3%; Those Not in Labor Force Rose an Amazing 1,177,000 as follows:
Some of those labor force numbers are due to annual revisions. However, the point remains: People are dropping out of the labor force at an astounding, almost unbelievable rate, holding the unemployment rate artificially low.
Emphasis in red as written, not added.

I could have, and should have, expounded on the first sentence, even though I stick with what I said in the second sentence.

Bondad is generally a good guy and he also has the coolest dog in the blogosphere, even though we fundamentally differ on politics.

Meet Bondad's Dog "Weimar"



Zero Hedge replied in Explaining Yesterday's Seasonally Adjusted Nonfarm Payroll "Beat"

Does Zero Hedge have a point?

Yes, even though he is scrambling hard to make it.

Been there. Done that. It happens. No one is perfect, certainly not me.

Trim Tabs Has a Firm Opinion Too



BLS Data Skewed?

Based on federal income tax receipts, Trim Tabs asks Is BLS Data Skewed?
Our estimate of a slowly growing economy is based primarily upon daily income tax collections. Either there is something massively changed in the income tax collection world, or there is something very suspicious about today's Bureau of Labor Statistics hugely positive number. We continue to check and recheck our analysis of income tax collections. We are aware that another service believes that incomes are growing faster than we do. So far we have not found any errors or discrepancies in our work, but if we do, we will let you know.

I keep repeating that the BLS refuses to use the data embedded in income tax collections to be able to report real time jobs and wages. Why does it refuse? Could the reason it refuses to use real time data on jobs and incomes be because perhaps this jobs number is politically motivated? The entire world is looking at US job creation as a proxy on how well Obama is doing? Could the Obama administration be pressuring its economist employees to create the best possible new jobs number?
Obamanomics?

Readers should know by now that I discount most conspiracy theories. It's not that I believe conspiracies don't happen, but rather those that do are quickly exposed. Paulson used a bazooka right out in the open to force Bank of America to merge with Merrill Lynch. Geithner and others are guilty as well. It was all very visible and quickly reported.

Is the BLS purposely manipulating numbers to benefit Obama? I rather doubt it. Someone would know and yap.

Yet, I have no explanation for payroll tax data. Some things do not add up, and it's best to look at things from more than one angle.

So let's take a closer look at the data to see what's happening.

The Case for Headline Payroll +243,000

Let's start off with the strongest argument anyone can make for the bullish jobs case.



The above chart is condensed from the January 2012 Non-Manufacturing ISM Report On Business®

ISM Questions and Answers

  • Was I surprised by the services ISM? Yes, I was. 
  • Does it help explain the headline job number?  Yes, it does.
  • Does it explain the unemployment rate? No, it doesn't.

There are still questions about seasonal adjustments, confirming data, etc., but those are relatively good ISM numbers.

Let's turn our attention to the unemployment rate.

Civilian Labor Force



The BLS labor force numbers seem suspect. The labor force is smaller now than when the recession ended 2.5 years ago.

Current Labor Force: 154,395,000.
June 2009 Labor Force: 154,730,000.

Based on trends, the labor force ought to be close to 160,000,000.

Boomer demographics can explain part of the "trendline failure," but not all of it. The US is adding working-age population every year, just at a decreasing rate. In other words, the labor force should be rising, even if at a reduced rate (at least in theory).


What Rate?

In 2000, it took about 150,000 jobs a month to keep up with the birth rate and immigration. Recently, Bernanke stated the number is 125,000 jobs. Could it be lower? Certainly, but the number is not zero.

Total Nonfarm Employees



There are currently 132,409,000 nonfarm employees. In December of 2000 there were 132,481,000 employees. How's that for job growth?

Civilian Employment



Civilian employment is currently 141,637,000.
In May of 2005 civilian employment was 141,609,000.

Civilian Unemployment Rate



The recession ended in June of 2009. The labor force was 154,730,000. The labor force is now 154,395,000. Is this credible? If it's not credible, then neither is the unemployment rate!

Unemployment Rate What If?

Labor Force        Unemployment Rate
155,000,000        8.6%
156,000,000        9.2%
157,000,000        9.8%
158,000,000        10.4%
159,000,000        10.9%
160,000,000        11.5%

With very modest labor force growth to 157 million (a mere 90,800 a month since the recession ended), the unemployment rate would be 9.8%.

Using Bernanke's estimate of 125,000 jobs a month, the labor force would be 158,480,000 and the unemployment rate would be 10.6%. Growing at the trend, the unemployment rate would be 11.5%.
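For anyone who wants to check the arithmetic, here is a minimal sketch in Python that reproduces the table above. It assumes civilian employment stays at the reported 141,637,000 and treats everyone else in a larger labor force as unemployed; that assumption is mine, made only to show how the rate moves with the size of the labor force.

```python
# Minimal sketch: how the unemployment rate moves with the size of the labor force,
# holding civilian employment at the reported 141,637,000.
EMPLOYED = 141_637_000             # current civilian employment (from the post)
CURRENT_LABOR_FORCE = 154_395_000  # current labor force (from the post)

def unemployment_rate(labor_force, employed=EMPLOYED):
    """Unemployment rate = (labor force - employed) / labor force."""
    return (labor_force - employed) / labor_force * 100

print(f"Reported labor force: {unemployment_rate(CURRENT_LABOR_FORCE):.1f}%")  # ~8.3%

for lf in range(155_000_000, 161_000_000, 1_000_000):
    print(f"Labor force {lf:>11,}: {unemployment_rate(lf):.1f}%")
```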

Has Time Rewritten Every Lie?

To paraphrase Barbra Streisand: "Can it be all so simple then, or has time rewritten every lie?"

Are You "Really" Unemployed?



Link if video does not play: The Unemployment Game Show: Are You "Really" Unemployed?

Please play the video. It's hilarious.

In regard to the wild jump in "those not in the labor force" relative to growth in the overall population, the BLS notes "the population increase was primarily among persons 55 and older and, to a lesser degree, persons 16 to 24 years of age. Both these age groups have lower levels of labor force participation than the general population."

Forced Retirement

The BLS argument may sound plausible but I do not buy it. Persons 55-62 will still generally be looking for a job. Even many older than 65 will still be looking for a job because they cannot afford to retire.

Instead, I propose a combination of three factors.

  1. For reasons noted in the "are you unemployed?" video, people stopped looking for jobs and are not counted in the labor force. Yes, this is on purpose but it predates Obama.
  2. Many people have exhausted 100 weeks of unemployment benefits and have no income coming in at all. Some in that category retired and are now gone from the labor force, scraping by on Social Security benefits.
  3. Still others took part-time work and thus are considered employed.

Gallup Chimes In

Gallup reports that unemployment is 8.6%, not seasonally adjusted. That is close to the BLS number, but Gallup surveys those 18 and older while the BLS covers those 16 and older.

Otherwise, the sampling metrics are similar. The biggest difference appears in the actual count of underemployment (unemployed + those wanting a full-time job but only finding part-time work).

Please consider a pair of charts from the Gallup report U.S. Unemployment Up, to 8.6% in January

Percentage of US Workers in Part-Time Jobs, Wanting Full-Time Employment



Underemployment, a measure that combines the percentage of workers who are unemployed with the percentage working part time but wanting full-time work, surged to 18.7% in January. This is a worsening from the 18.3% of December but is still below the 19.0% of a year ago.

Total Underemployment



BLS Alternative Measures



click on chart for sharper image

 The BLS "alternative" measure of underemployment is 16.2% (not seasonally adjusted) compared with 18.7% as surveyed by Gallup. As noted above, Gallup does not include results of those aged 16 and 17 while the BLS does (otherwise Gallup's numbers would be higher still).

Which set of numbers tells the better story? Here's a hint: it's not the BLS.

I will stick with what I have said on many occasions "People are dropping out of the labor force at an astounding, almost unbelievable rate, holding the unemployment rate artificially low."

The reason is not a recount based on the 2010 census, nor is it purely demographics, nor is it Obamanomics. The reason is severe and sustained fundamental economic weakness, coupled with existing purposely-distorted definitions of what constitutes "unemployment".

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com
Click Here To Scroll Thru My Recent Post List


Is Romney to Blame for Paying Low Taxes or is 72,536 Pages of Tax Code to Blame? What's the Real Solution? Thanks to AMT, Man Pays 102% Tax Rate

Posted: 06 Feb 2012 11:08 AM PST

I am not a fan of Mitt Romney. Thus I am not displaying bias when I say the ire over what Romney pays in taxes is misplaced.

Caroline Baum has an excellent article on Bloomberg today that points to the real source of the problem: Never Mind the Tax Cheats -- Go After the Tax Code
Millionaires paying an effective 15 percent tax rate because their income is from investments? Blame the tax code. Carried interest, a form of income that accrues to hedge fund and private equity managers, taxed at the more favorable capital gains rate? The tax code's the culprit.

Yes, there are a lot of tax cheats out there who aren't playing by the rules. What Obama objects to -- Warren Buffett paying a lower effective tax rate than his secretary -- is ordained by the grotesque, 72,536-page tax code.

In researching a recent column, I went back to "The Flat Tax," published by economists and Hoover Institution fellows Robert Hall and Alvin Rabushka in 1985. They proposed a revenue-neutral flat tax of 19 percent. All income would be taxed once, and only once, at the same rate and as close to the source as possible. "Whenever different forms of income are taxed at different rates or different taxpayers face different rates," they write, "the public figures out how to take advantage of the differential."

Bingo. I'm no tax expert -- I have trouble gathering all the necessary information to take to the accountant once a year -- but I know enough to recognize that the tax code is the problem, not the folks who capitalize on its myriad of loopholes.

Whether it's a flat tax or a national retail sales tax, simpler is better: for each of us and for the economy overall. So the next time the president says he wants everyone to play by the rules, please tell him it's the rules that are broken -- not to mention the rule-makers.
Tax Law Keeps Piling Up


click on chart for sharper images
Image from Tax Law Pile Up

Are the Rules Broken or the Rule-Makers?

If taxes should be collected as close to the source as possible, the same can be said for the source of the problem. Congress cannot resist tinkering with anything and everything.

The image above shows what 90 years of tinkering have done. Heck, from 1984 until now, 46,236 pages of tax code have been added.

There are breaks for mortgages, charitable deductions, hedge funds, the oil industry, home builders, and too many things to mention. Instead of fixing the problem at the source, Congress added an "Alternative Minimum Tax".

102% Tax Rate?

New York Times reporter James Stewart says he was "dismayed" by his own tax rate as compared to Romney, so he invited readers to send e-mails disclosing their tax rates and circumstances. Stewart was "deluged with submissions".

One respondent, James Ross, a founder and managing member of Rossrock, a Manhattan-based private investment firm that focuses on commercial real estate and distressed commercial mortgages, beat everyone hands down.

Let's pick up the story from there as reported by the NYT in At 102%, His Tax Rate Takes the Cake
"My entire taxable income, plus some, went to the payment of taxes," Mr. Ross said. "This does not include real estate taxes, sales taxes and other taxes I paid for 2010." When he told friends and family, they were "astounded," he said.

That doesn't mean Mr. Ross pays more in taxes than he earns. His total tax as a percentage of his adjusted gross income was 20 percent, which is much lower than mine.

That's because Mr. Ross has so many itemized deductions. Since taxable income is what's left after itemized deductions like mortgage interest, charitable contributions, and state and local taxes are subtracted, it will nearly always be smaller than adjusted gross income and demonstrates how someone can pay more than 100 percent of taxable income in tax. Mr. Ross must hope that his interest expense will pay off down the road and generate some capital gains.

How could Mr. Ross pay so much? I thought I was the victim of a perfect storm of punitive tax policies, but Mr. Ross's situation is worse.

Like me, he lives and works in New York City, which all but guarantees a high tax rate. Nearly all of his income is earned income and thus fully taxable at top rates. (He said that's not always the case, but given the recent dire condition of real estate, in 2010 he had few capital gains and his carried interest didn't yield any income.) Unlike me, he can't make any itemized deductions, which means his adjusted gross income exceeds $1 million, the level at which New York State eliminates all itemized deductions, except for 50 percent of the value of charitable contributions. Mr. Ross said he gave 11 percent of his adjusted gross income to charity.

That means Mr. Ross can't deduct any interest expense on the money he borrows to finance his real estate investments, which is substantial, nor can he deduct any other expenses or other itemized deductions except for part of his charitable contributions. This means he pays an enormous amount in state and local taxes. Since those are among the deductions that are disallowed when computing the federal alternative minimum tax, Mr. Ross is in turn especially hard hit by the A.M.T.

Mr. Ross said he asked his accountant what he could do. "He said, 'Fire everyone here and move to Florida,' " according to Mr. Ross. He employs 10 people in his New York office.
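To see how a tax bill can exceed 100% of taxable income while still being only 20% of adjusted gross income, here is a quick illustration with hypothetical round numbers (these are not Mr. Ross's actual figures, just an example of the arithmetic):

```python
# Hypothetical illustration of a >100% effective rate on taxable income.
# When large deductions are disallowed (AMT, New York's phase-out), the tax is
# computed on a much broader base than "taxable income", so the tax paid can
# exceed taxable income even though it is only ~20% of adjusted gross income.
agi = 1_000_000            # adjusted gross income (hypothetical)
tax_paid = 200_000         # 20% of AGI, as in the article
taxable_income = 196_000   # what little is left after (mostly disallowed) deductions

print(f"Tax as % of AGI:            {tax_paid / agi:.0%}")             # 20%
print(f"Tax as % of taxable income: {tax_paid / taxable_income:.0%}")  # ~102%
```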
Seriously Inane Proposals

One extremely misguided soul proposed in a comment on my blog the other day that it would be "fair" if everyone paid the same percentage of their income for things.

Under this proposal, one would need to provide proof of income to buy anything. Then, those with $1 of income would get everything for free, because a percentage of $1 does not go far, while those who make $1 million would pay $400 or whatever for a loaf of bread. Clearly this proposal is inane, yet such misguided ideas are likely behind the absurd complexity of the AMT.


Simple Solution

The simple solution is to scrap the tax code entirely and start all over, with a blank slate.

The only fair way to do things is for everyone to be treated equally. No breaks for homeowners, no alternative minimum tax, no graduated taxes, just a simple set of flat taxes.

Since we need to promote more savings, I would rather see a national sales tax as opposed to an income tax. Regressive? Nope. It does not have to be.

I propose no tax on food, medicine and medical supplies, shelter, or clothes. Since a huge percentage of the income of the poor goes to food, shelter, and clothes, no one can scream "regressive".

Still, everyone would be treated equally, unlike, say, a mortgage deduction, which only benefits homeowners. Everyone does eat and needs shelter.

How about a combination flat income tax and national sales tax, perhaps split 50-50, keeping tax collection revenue neutral?

Those who "buy things" other than food, shelter, and clothes (notably the wealthy), would perforce pay a higher share yet everyone would be treated equally under the law.

Moreover, a sales tax is as close to the source as one can get. So is an income tax with no deductions.

What we cannot do is pretend that 72,536 pages of tax code can be fixed. It can't be. It's time to start all over with a blank slate and ideally 500 pages of tax code or less.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com
Click Here To Scroll Thru My Recent Post List


Huge Plunge In Petroleum and Gasoline Usage

Posted: 06 Feb 2012 12:26 AM PST

Inquiring minds are watching a plunge in Petroleum Distillates and Gasoline usage.

Reader Tim Wallace writes
Hello Mish

As I have been telling you recently, there is some unprecedented data coming out in petroleum distillates, and they slap me in the face and tell me we have some very bad economic trends going on, totally out of line with such things as the hopium market - I mean stock market.

This past week I actually had to reformat my graphs as the drop off peak exceeded my bottom number for reporting off peak - a drop of ALMOST 4,000,000 BARRELS PER DAY off the peak usage in our past for this week of the year.

I have added a new graph to my distillates report, a "Graph of Raw Data" to which I have added a polynomial trendline. You can easily see that the plunge is accelerating and more than rivals 2008/09 and in gasoline is greatly exceeding the rate.

An amazing thing to note is that in two out of the last three weeks gasoline usage has dropped below 8,000,000 barrels per day.

The last time usage fell that low was the week of September 21, 2001! And you know what that week was! Prior to that you have to go back to 1996 to have a time period truly consistently below 8,000. We have done it two out of the last three weeks.

The second graph once again shows the year on year change in usage of distillates. The Obama "stimulus" package and Fed monetary actions masked the underlying systemic problems.

The third and final graph shows the changes in usage off the peak year of 2007. Once again you can see the effect of the stimulus and how now we are heading below 2008/09 in an accelerating fashion.

Looking at these numbers I believe we are about to have a surge in unemployment - by the end of April latest, possibly as early as beginning of March.

Tim
Petroleum Distillates and Gasoline Usage in Barrels per Day



click on any chart for sharper image

Note that on a best curve fit, petroleum usage is back to the 1997 level and gasoline usage is back to the 2001 level. Moreover, as Wallace points out, in two of the last three weeks gasoline usage has dropped below 8,000,000 barrels per day.
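Wallace mentions adding a polynomial trendline to the raw weekly data. For readers who track the EIA weekly numbers themselves, here is a minimal sketch of that kind of fit; the usage figures below are placeholders, not the actual series.

```python
# Sketch: fit a polynomial trendline to weekly gasoline usage, similar in spirit
# to the trendline Wallace describes. The numbers here are placeholders, not EIA data.
import numpy as np

weeks = np.arange(12)                            # week index
usage = np.array([9.0, 8.9, 8.8, 8.7, 8.6, 8.5,  # million barrels/day (placeholder)
                  8.4, 8.2, 8.1, 8.0, 7.9, 7.8])

trend = np.poly1d(np.polyfit(weeks, usage, deg=2))  # 2nd-degree polynomial fit

for w in weeks:
    print(f"week {w:2d}: actual {usage[w]:.1f}, trend {trend(w):.2f}")
```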

Year-Over-Year Petroleum and Gasoline Usage (Compared to Peak Usage)



Note the trough of the recent recession, the rebound, and now a sudden plunge in gasoline and petroleum usage once again.

Decline from Peak Usage



A mild winter can explain part of the drop in petroleum usage (heating oil), but it does not explain the declines in gasoline usage or the overall trends.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com
Click Here To Scroll Thru My Recent Post List


Damn Cool Pics



Freaky FFXIII Shiva Sisters Cosplay

Posted: 06 Feb 2012 03:22 PM PST

Video game designers are constantly trying to design characters that are hard to duplicate in real life. And every time, creative cosplayers find a way to do it. Take a look at the Shiva Sisters, Styria and Nix, from the game Final Fantasy XIII. Impressive, to be sure. It's the work of "the-mirror-melts".










Via: unrealitymag


The Wild, Wild Web: Wrestling Online Privacy [infographic]

Posted: 06 Feb 2012 01:26 PM PST



According to Carnegie Mellon researchers, information listed on social media may be enough to guess a social security number, the key to identity theft. And with mobile banking apps, more and more people are logging sensitive information from their smart phones. Add confusing Terms of Service agreements into the mix (they take an average of 10 minutes each to read!), and it's easy to see why online privacy can feel mystifying.

The following infographic helps explain some of the biggest issues in web safety and gives tips on how to keep yourself protected, from passwords to privacy policies. With a few steps, you can be confident that you control what you share online.

Click on Image to Enlarge.

Via: frudaldad


Let's Move! is Turning Two

The White House

Your Daily Snapshot for
Monday, February 6, 2012

 

Let's Move! is Turning Two  

This week marks the second anniversary of Let’s Move!, the First Lady’s initiative to solve the problem of childhood obesity within a generation. Today at 2:30 p.m. EST, join Sam Kass, Assistant Chef and Senior Policy Advisor for Healthy Food Initiatives, for a special session of Let’s Move! Office Hours.

Want to ask Sam a question? Find out how to get involved.

Behind the Scenes: December 2011 Photos

Follow our Flickr Photostream to see some behind-the-scenes photos of the President as he welcomes home Iraq veterans, takes Bo for a walk, and does the coin toss for the Army-Navy game.

WH Behind the Scenes December 2011

President Barack Obama plays with Bo, the Obama family dog, aboard Air Force One during a flight to Hawaii, Dec. 23, 2011. (Official White House Photo by Pete Souza)

In Case You Missed It

Here are some of the top stories from the White House blog:

Weekly Address: It’s Time for Congress to Act to Help Responsible Homeowners
President Obama continues his call for a return to American values, including fairness and equality, as part of his blueprint for an economy built to last.

Weekly Wrap Up: Hanging Out with America
A glimpse at what happened this week at WhiteHouse.gov.
 
From the Archives: Startup America White Board
A White House White Board released for the launch of Startup America last year explains how the initiative will help entrepreneurs avoid the "valley of death" when starting new ventures.

Today's Schedule

All times are Eastern Standard Time (EST).

11:00 AM: The Vice President visits Florida State University WhiteHouse.gov/live

12:00 PM: The President receives the Presidential Daily Briefing

12:30 PM: Press Briefing by Press Secretary Jay Carney WhiteHouse.gov/live

2:30 PM: The President meets with senior advisors

2:45 PM: The Vice President attends a campaign event 
 
4:30 PM: The Vice President attends a campaign event

WhiteHouse.gov/live Indicates that the event will be live-streamed on WhiteHouse.gov/Live


 

Find Your Site's Biggest Technical Flaws in 60 Minutes



Find Your Site's Biggest Technical Flaws in 60 Minutes

Posted: 05 Feb 2012 01:14 PM PST

Posted by Dave Sottimano

I've deliberately put myself in some hot water to demonstrate how I would do a technical SEO site audit in 1 hour to look for quick fixes (and I've actually timed myself just to make it harder). For the pros out there, here's a look into a fellow SEO's workflow; for the aspiring, here's a base set of checks you can do quickly.

I've got some lovely volunteers who have kindly allowed me to audit their sites to show you what can be done in as little as 60 minutes.

I'm specifically going to look for crawling, indexing, and potential Panda-threatening issues like:

  1. Architecture (unnecessary redirection, orphaned pages, nofollow)
  2. Indexing & Crawling (canonical, noindex, follow, nofollow, redirects, robots.txt, server errors)
  3. Duplicate content & on-page SEO (repeated text, pagination, parameter-based duplication, dupe/missing titles, h1s, etc.)

Don't worry if you're not technical; most of the tools and methods I'm going to use are very well documented around the web.

Let's meet our volunteers!

Here's what I'll be using to do this job:

  1. SEOmoz toolbar - Make sure "highlight nofollow links" is turned on so you can visually diagnose crawl path restrictions
  2. Screaming Frog Crawler - Full website crawl with Screaming Frog (User agent set to Googlebot) - Full user guide here
  3. Chrome and Firefox (FF will have JavaScript and CSS disabled and the user agent set to Googlebot) - To look for usability problems caused by CSS or JavaScript
  4. Google search queries - to check the index for issues like content duplication, dupe subdomains, penalties, etc.

Here are other checks I've done, but left out in the interest of keeping it short:

  1. Open Site Explorer - Download a back link report to see if you're missing out on links pointing to orphaned, 302 or incorrect URLs on your site. If you find people linking incorrectly, add some 301 rules on your site to harness that link juice
  2. http://www.tomanthony.co.uk/tools/bulk-http-header-compare/ - Check if the site is redirecting Googlebot specifically 
  3. http://spyonweb.com/ - Any other domains connected you should know about? Mainly for duplicate content
  4. http://builtwith.com/ - Find out if the site is using Apache, IIS, PHP and you'll know which vulnerabilities to look for automatically
  5. Check for hidden text, CSS display:none funniness, robots.txt blocked external JS files, hacked / orphaned pages

My essential reports before I dive in:

  1. Full website crawl with Screaming Frog (User agent set to Googlebot)
  2. A report of everything in Google's index using the site: operator (1,000 results per query, unfortunately - this is how I do it)

Down to business...

Architecture Issues

1) Important broken links

We'll always have broken links here and there, and in an ideal world they would all work. Just make sure, for SEO & usability, that important links (homepage) are always in good shape. The following broken link is on the Webrevolve homepage; it should point to their blog but returns a 404. This is an important link because it's a great feature and I definitely do want to read more of their content.

   

Fix: Get in there and point that link to the correct page which is http://www.webrevolve.com/our-blog/

How did I find it: Screaming Frog > response codes report

2) Unnecessary Redirection

This happens a lot more than people like to believe. The problem is that when we 301 a page to a new home, we often forget to correct the internal links pointing to the old page (the one with the 301 redirect).

This page http://www.lexingtonlaw.com/credit-education/foreclosure.html 301 redirects to http://www.lexingtonlaw.com/credit-education/foreclosure-2.html

However, they still have internal links pointing to the old page.

  • http://www.lexingtonlaw.com/credit-education/bankruptcy.html?linkid=bankruptcy
  • http://www.lexingtonlaw.com/blog/category/credit-repair/page/10
  • http://www.lexingtonlaw.com/credit-education/bankruptcy.html?select_state=1&linkid=selectstate
  • http://www.lexingtonlaw.com/credit-education/collections.html

Fix: Get in that CMS and change the internal links to point to http://www.lexingtonlaw.com/credit-education/foreclosure-2.html

How did I find it: Screaming Frog > response codes report
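If you don't have Screaming Frog handy, a rough stand-in for its response codes report can be scripted. The sketch below (Python, using the requests library) flags URLs that return 404 (broken) or 301/302 (unnecessary redirection); the URL list is just an example, not a complete crawl.

```python
# Rough stand-in for Screaming Frog's response codes report: flag internal links
# that are broken (404) or point at redirects (301/302). Example URLs only.
import requests

urls = [
    "http://www.webrevolve.com/our-blog/",
    "http://www.lexingtonlaw.com/credit-education/foreclosure.html",
]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code == 404:
        print(f"BROKEN    {url}")
    elif resp.status_code in (301, 302):
        print(f"REDIRECT  {url} -> {resp.headers.get('Location')}")
    else:
        print(f"OK ({resp.status_code})  {url}")
```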

3) Multiple subdomains - Canonicalizing the www or non-www version

One of the first basic principles of SEO, and there are still tons of legacy sites that are tragically splitting their link authority by not redirecting the www to the non-www version or vice versa.

Sorry to pick on you, CVCSports :S

  • http://cvcsports.com/
  • http://www.cvcsports.com/

Oh, and a couple more have got their way into Google's index that you should remove too:

  • http://smtp.cvcsports.com/
  • http://pop.cvcsports.com/
  • http://mx1.cvcsports.com/
  • http://ww.cvcsports.com/
  • http://www.buildyourjacket.com/
  • http://buildyourjacket.com/

Basically, you have 7 copies of your site in the index.

Fix: I recommend using www.cvcsports.com as the main page, and you should use your htaccess file to create 301 redirects for all of these subdomains to the main www site.

How did I find it? Google query "site:cvcsports.com -www" (I also set my results number to 100 to check through the index quicker)
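As a quick follow-up check (the actual fix belongs in the htaccess rules, as noted above), here is a small sketch that tests whether each stray host 301s to the main www site. The hostnames are taken from the list above.

```python
# Verify that each duplicate host 301s to the canonical www host.
# The real fix is the 301 rules in .htaccess; this only checks the behaviour.
import requests

CANONICAL = "http://www.cvcsports.com/"
duplicates = [
    "http://cvcsports.com/",
    "http://smtp.cvcsports.com/",
    "http://pop.cvcsports.com/",
    "http://mx1.cvcsports.com/",
    "http://ww.cvcsports.com/",
]

for url in duplicates:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith(CANONICAL):
        print(f"OK        {url} -> {location}")
    else:
        print(f"NOT FIXED {url} (status {resp.status_code})")
```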

4) Keeping URL structure consistent 

It's important to note that this only becomes a problem when external links are pointing to the wrong URLs. *Almost* every back link is precious, and we want to ensure that we get maximum value from each one. Except we can't control how we get linked to: without www, with capitals, or with trailing slashes, for example. Short of contacting the webmaster to change it, we can always employ 301 redirects to harness as much value as possible. The one place this shouldn't happen is on your own site.

We all know that www.example.com/CAPITALS is different from www.example.com/capitals when it comes to external link juice. As good SEOs we typically combat human error by having permanent redirect rules to enforce only one version of a URL (ex. forcing lowercase), which can cause unnecessary redirects if internal links contradict those rules.

Here are some examples from our sites:

  • http://www.lexingtonlaw.com/credit-education/rebuild-credit 301s to the trailing slash version
  • http://webrevolve.com/web-design-development/conversion-rate-optimisation/ Redirects to the www version

Fix: Determine your URL structure: should they all have trailing slashes, www, lowercase? Whatever you decide, be consistent and you can avoid future problems. Crawl your site, and fix these inconsistent internal links.

Indexing & Crawling

1) Check for Penalties

None of our volunteers have any immediately noticeable penalties, so we can just move on. This is a 2 second check that you must do before trying to nitpick at other issues.

How did I do it? Google search queries for the exact homepage URL and the brand name. If the site doesn't show up, you'll have to investigate further.

2) Canonical, noindex, follow, nofollow, robots.txt

I always do this so I understand how clued up SEO-wise the developers are, and to gain more insight into the site. You wouldn't check for these tags in detail unless you had just cause (ex. a page that should be ranking isn't).

I'm going to combine this section as it requires much more than just a quick look, especially on bigger sites. First and foremost, check robots.txt and look through some of the blocked directories; try to determine why they are being blocked and which bots they are blocking them from. Next, get Screaming Frog in the mix, as its internal crawl report will automatically check each URL for meta data (noindex, header-level nofollow & follow) and give you the canonical URL if there happens to be one.

If you're spot checking a site, the first thing you should do is understand what tags are in use and why they're using them.

Take Webrevolve, for instance: they've chosen to NOINDEX,FOLLOW all of their blog author pages.

  • http://www.webrevolve.com/author/tom/ 
  • http://www.webrevolve.com/author/paul/

This is a guess, but I think these pages don't provide much value and are generally not worth seeing in search results. If these were valuable, traffic-driving pages, I would suggest they remove NOINDEX, but in this case I believe they've made the right choice.

They also implement self-serving canonical tags (yes, I just made that up); basically, each page has a canonical tag that points to itself. I generally have no problem with this practice as it usually makes it easier for developers.

Example: http://www.webrevolve.com/our-work/websites/ecommerce/
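If you just want to spot-check a single page without a full crawl, the tags are easy to read directly. Here is a minimal sketch using requests and BeautifulSoup (the URL is simply the example above):

```python
# Spot-check a page's meta robots and canonical tag -- the same fields Screaming
# Frog reports in its internal crawl. The URL is just the example from the post.
import requests
from bs4 import BeautifulSoup

url = "http://www.webrevolve.com/our-work/websites/ecommerce/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

robots = soup.find("meta", attrs={"name": "robots"})
canonical = soup.find("link", rel="canonical")

print("meta robots:", robots.get("content") if robots else "none")
print("canonical:  ", canonical.get("href") if canonical else "none")
```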

3) Number of pages VS Number of pages indexed by Google

What we really want to know here is how many pages Google has indexed. There are two ways of doing this. Using Google Webmaster Tools, submit a sitemap and you'll get stats back on how many URLs are actually in the index.

Or you can do it without access, but it's much less efficient. This is how I would check...

  1. Run a Screaming Frog Crawl (make sure you obey robots.txt)
  2. Do a site: query
  3. Get the *almost never accurate* results number and compare them to total pages in crawl

If the numbers aren't close, as with CVCSports (206 pages vs. 469 in the index), you probably want to look into it further.

   

I can tell you right now that CVCSports has 206 pages (not counting those that have been blocked by robots.txt). Just by doing this quickly I can tell there's something funny going on and I need to look deeper.

Just to cut to the chase, CVCSports has multiple copies of the site on subdomains, which is causing this.

Fix: It varies. You could have complicated problems, or it might just be as easy as using canonical tags, noindex, or 301 redirects. Don't be tempted to block the unwanted pages with robots.txt, as this will not remove pages from the index; it will only prevent those pages from being crawled.

Duplicate Content & On Page SEO

Google's Panda update was definitely a game changer, and it caused massive losses for some sites. One of the easiest ways of avoiding at least part of Panda's destructive path is to avoid all duplicate content on your site.

1) Parameter based duplication

URL parameters like search= or keyword= often cause duplication unintentionally. Here are some examples:

  • http://www.lexingtonlaw.com/credit-repair-news/economic-and-credit-trends/mortgage-lenders-rejecting-more-applications.html
  • http://www.lexingtonlaw.com/credit-repair-news/economic-and-credit-trends/mortgage-lenders-rejecting-more-applications.html?select_state=1&linkid=selectstate
  • http://www.lexingtonlaw.com/credit-repair-news/credit-report-news/california-ruling-sets-off-credit-fraud-concerns.html
  • http://www.lexingtonlaw.com/credit-repair-news/credit-report-news/california-ruling-sets-off-credit-fraud-concerns.html?select_state=1&linkid=selectstate
  • http://www.lexingtonlaw.com/credit-repair-news/economic-and-credit-trends/one-third-dont-save-for-christmas.html
  • http://www.lexingtonlaw.com/credit-repair-news/economic-and-credit-trends/one-third-dont-save-for-christmas.html?select_state=1&linkid=selectstate
  • http://www.lexingtonlaw.com/credit-repair-news/economic-and-credit-trends/financial-issues-driving-many-families-to-double-triple-up.html
  • http://www.lexingtonlaw.com/credit-repair-news/economic-and-credit-trends/financial-issues-driving-many-families-to-double-triple-up.html?select_state=1&linkid=selectstate

Fix: Again, it varies. If I were giving general advice, I would say use clean links in the first place - depending on the complexity of the site you might consider 301s, canonical tags, or even NOINDEX. Either way, just get rid of them!

How did I find it? Screaming Frog > Internal Crawl > Hash tag column

Basically, Screaming Frog creates a hexadecimal hash of each page's source code. If two URLs have matching hash values, you have duplicate source code (exact dupe content). Once you have your crawl ready, use Excel to filter it out (complete instructions here).
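The same idea is easy to reproduce outside Screaming Frog: hash each page's source and group URLs that share a hash. A minimal sketch (the URL pair is one of the examples above):

```python
# Mimic Screaming Frog's hash column: hash each page's raw source and group URLs
# that share a hash, i.e. byte-for-byte duplicate content. Example URLs only.
import hashlib
from collections import defaultdict

import requests

urls = [
    "http://www.lexingtonlaw.com/credit-repair-news/credit-report-news/california-ruling-sets-off-credit-fraud-concerns.html",
    "http://www.lexingtonlaw.com/credit-repair-news/credit-report-news/california-ruling-sets-off-credit-fraud-concerns.html?select_state=1&linkid=selectstate",
]

by_hash = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    by_hash[hashlib.md5(html.encode("utf-8")).hexdigest()].append(url)

for digest, group in by_hash.items():
    if len(group) > 1:
        print(f"Exact duplicates ({digest[:8]}...):")
        for u in group:
            print("  ", u)
```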

2) Duplicate Text content

Having the same text on multiple pages shouldn't be a crime, but post-Panda it's better to avoid it completely. I hate to disappoint here, but there's no exact science to finding duplicate text content.

Sorry CVCSports, you're up again ;)

http://www.copyscape.com/?q=http%3A%2F%2Fwwww.cvcsports.com%2F

Don't worry, we've already addressed your issues above; just use 301 redirects to get rid of these copies.

Fix: Write unique content as much as possible. Or be cheap and stick it in an image; that works too.

How did I find it? I used http://www.copyscape.com, but you can also copy & paste text into Google search

3) Duplication caused by pagination

Page 1, Page 2, Page 3... You get the picture. Over time, sites can accumulate thousands if not millions of duplicate pages because of those nifty page links. I swear I've seen a site with 300 pages for one product page.

Our examples:

  • http://cvcsports.com/blog?page=1
  • http://cvcsports.com/blog?page=2

Are they being indexed? Yes.

Another example?

  • http://www.lexingtonlaw.com/blog/page/23
  • http://www.lexingtonlaw.com/blog/page/22

Are they being indexed? Yes.

Fix: General advice is to use the NOINDEX, FOLLOW directive. (This tells Google not to add the page to the index, but to keep crawling through it.) An alternative might be to use the canonical tag, but this all depends on the reason why pagination exists. For example, if you had a story that was separated across 3 pages, you definitely would want them all indexed. However, these example pages are pretty thin and *could* be considered low quality by Google.

How did I find it? Screaming Frog > Internal links > Check for pagination parameters 

Open up the pages and you'll quickly determine if they are auto-generated, thin pages. Once you know the pagination parameter or the structure of the URL, you can check Google's index like so: site:example.com inurl:page=


Time's up! There's so much more I wish I could do, but I was strict about the 1 hour time limit. A big thank you to the brave volunteers who put their sites forward for this post. There was one site that just didn't make the cut, mainly because they've done a great job technically, and, um, I couldn't find any technical faults.

Now it's time for the community to take some shots at me! 

  • How did I do?
  • What could I have done better? 
  • Any super awesome tools I forgot?
  • Any additional tips for the volunteer sites?

Thanks for reading. You can reach me on Twitter @dsottimano if you want to chat and share your secrets ;)


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!