Saturday, 25 May 2013

Mish's Global Economic Trend Analysis

Corporate Share Buybacks: How Timely Are They?

Posted: 25 May 2013 08:52 AM PDT

Factset Buyback Quarterly has an interesting series of charts and facts on corporate share buybacks.

Here is my favorite chart in the series.


Aggregate Buybacks: Dollar-value share repurchases amounted to $93.8 billion over the fourth quarter and $384.3 billion for 2012. The fourth-quarter total was in line with Q3's, but represented year-over-year growth of 9.6%.

Sector Trends: The Information Technology and Health Care sectors spent the most on quarterly repurchases ($19.8 billion and $14.4 billion, respectively) in Q4 2012. However, of the sectors that averaged $2 billion or more in quarterly share repurchases since 2005, the Industrials sector showed the largest sequential and year-over-year growth (30.6% and 59.4%) in dollar-value buybacks.

Buyback Conviction: Dollar-value buybacks amounted to 79.1% of free cash flow on a trailing twelve-month basis, which is the largest value since Q3 2008. The Consumer Discretionary and Consumer Staples sectors both spent more than 100% of their free cash flow (116.7% and 114.2%, respectively). The Energy and Utilities sectors spent $35.8 billion and $1.4 billion, respectively, on buybacks, despite generating negative free cash flow (-$25.7 billion and -$23.5 billion). The Consumer Discretionary sector also repurchased the most shares relative to its size: over the trailing twelve months, it bought back shares amounting to 4.5% of the sector's average shares outstanding over the year.

Timing Suspect at Best

One look at the above chart is all it takes to see that most shares are bought back at high prices rather than low prices.

And check out the latest authorizations.
Looking Forward: Program Announcements & Buyback Potential

Going forward, several companies in the S&P 500 have authorized new programs or additions of $1 billion or more since December 31st, including Gap (GPS), Blackrock (BLK), Marathon Petroleum (MPC), L-3 Communications (LLL), Visa (V), Allstate (ALL), Moody's (MCO), CBS Corporation (CBS), Dow Chemical (DOW), and AbbVie (ABBV). In addition, even larger authorizations were made by United Technologies Corp. (UTX), 3M Co. (MMM), and Lowe's (LOW), which announced replacement programs worth approximately $5.4 billion, $7.5 billion, and $5 billion, respectively, and Hess Corporation (HES), which announced a $4 billion buyback program on March 4th. Finally, a number of banks were approved to buy back large amounts of common and preferred shares in 2013: JPMorgan Chase (JPM) was approved for $6 billion in share repurchases, Bank of America (BAC) was approved for $5 billion in share repurchases plus $5.5 billion in redemption of preferred shares, and Bank of New York Mellon (BK), U.S. Bancorp (USB), State Street Corp (STT), and American Express (AXP) were also approved to repurchase more than $1 billion worth of shares.

Why? Share prices certainly are not cheap.

Many of these buybacks occur in conjunction with massive shareholder dilution via stock option grants to executives. The executives continually unload their shares, and the corporations buy them back.

Buybacks from the last two years generally look good, at least right now. But for how long? 2006 and 2007 buybacks looked good too, up until the crash.

Is this yet another case of "here we go again?"

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com 

Penguin 2.0/4 - Were You Jarred and/or Jolted?

Posted: 24 May 2013 06:07 AM PDT

Posted by Dr. Pete

The long-awaited Penguin 2.0 (also called "Penguin 4") rolled out on Wednesday, May 22nd. Rumors have been brewing for a while that the next Penguin update would be big and include significant algorithm changes, and Matt Cutts has suggested more than once that major changes are in the works. We wanted to give the dust a day to settle, so this post reviews data from our MozCast Google weather stations to see whether Penguin 2.0 really lives up to the hype.

Short-Term MozCast Data

First things first - the recorded temperature (algorithm "flux") for May 22nd was 80.7°F. For reference, MozCast is tuned to an average temperature of about 70°, though in reality that average has slipped into the high 60s over the past few months. Here's a 7-day history, along with a couple of significant events (including Penguin 1.0):

MozCast Temperatures (for 7 days around Penguin 2.0)

By our numbers, Penguin 2.0 was about on par with the 20th Panda update. Google claimed that Penguin 2.0 impacted about 2.3% of US/English queries, while they clocked Panda #20 at about 2.4% of queries (see my post on how to interpret "X% of queries"). Penguin 1.0 was measured at 3.1% of queries, the highest query impact Google has publicly reported. These three updates seem to line up pretty well between temperature and reported impact, but the reality is that we've seen big differences for other updates, so take that with a grain of salt.

Overall, the picture of Penguin 2.0 in our data confirms an update, but it doesn't seem to be as big as many people expected. Please note that we had a data collection issue on May 20th, so the temperatures for May 20-21 are unreliable. It's possible that Penguin 2.0 rolled out over two days, but we can't confirm that observation.

Temperatures by Category

In addition to the core MozCast data, we have a beta system running 10K keywords distributed across 20 industry categories (based on Google AdWords categories). The average temperature for any given category can vary quite a bit, so I looked at the difference between Penguin 2.0 and the previous 7 days for each category. Here they are, in order by most impacted (1-day/7-day temps in parentheses):

  • 33.0% (80°/60°) – Retailers & General Merchandise
  • 31.2% (81°/62°) – Real Estate
  • 30.8% (90°/69°) – Dining & Nightlife
  • 29.1% (89°/69°) – Internet & Telecom
  • 26.0% (82°/65°) – Law & Government
  • 24.4% (79°/64°) – Finance
  • 23.5% (81°/65°) – Occasions & Gifts
  • 20.8% (88°/73°) – Beauty & Personal Care
  • 17.3% (70°/60°) – Travel & Tourism
  • 15.7% (87°/75°) – Vehicles
  • 15.5% (84°/73°) – Arts & Entertainment
  • 15.4% (72°/62°) – Health
  • 15.0% (83°/72°) – Home & Garden
  • 14.2% (78°/69°) – Family & Community
  • 13.4% (79°/70°) – Apparel
  • 13.1% (78°/69°) – Hobbies & Leisure
  • 12.0% (74°/66°) – Jobs & Education
  • 11.5% (88°/79°) – Sports & Fitness
  • 7.8% (75°/70°) – Food & Groceries
  • -3.7% (70°/73°) – Computers & Consumer Electronics
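
(For anyone curious how a breakdown like this could be produced, here is a rough sketch of the calculation: each category's Penguin-day temperature compared with its average over the previous 7 days. The category names and numbers below are illustrative placeholders, not MozCast's actual data or code.)

```python
# Compare each category's Penguin-day "temperature" (flux) to its average over
# the previous 7 days, then rank categories by the percentage difference.
def flux_deltas(day_temps, prior_week_temps):
    """day_temps: {category: temp}; prior_week_temps: {category: [7 daily temps]}."""
    deltas = {}
    for category, temp in day_temps.items():
        baseline = sum(prior_week_temps[category]) / len(prior_week_temps[category])
        deltas[category] = (temp - baseline) / baseline * 100
    return sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative numbers only
day = {"Retailers & General Merchandise": 80, "Real Estate": 81}
week = {"Retailers & General Merchandise": [60] * 7, "Real Estate": [62] * 7}
for category, pct in flux_deltas(day, week):
    print(f"{pct:.1f}% - {category}")
```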

Retailers and Real Estate came in at the top, with temperatures just over 30% higher than average. Consumer Electronics, oddly, rounded out the bottom with slightly lower-than-average flux. Of course, split 20 ways, this represents a relatively small number of data points for each category. It's useful for reference, but I wouldn't read too much into these breakdowns.

"Big 20" Sub-domains

Across the beta 10K data-set, we track the top sub-domains by overall share of SERP real-estate. Essentially, we count how many page-1 positions each sub-domain holds and divide it across the entire data set. These were the Big 20 sub-domains for the day after Penguin 2.0 hit, along with their SERP share and 1-day change:

  1. 5.66% (+0.29%) – en.wikipedia.org
  2. 2.35% (-0.75%) – www.amazon.com
  3. 2.22% (+3.11%) – www.youtube.com
  4. 1.49% (+6.05%) – www.facebook.com
  5. 1.35% (-8.11%) – www.yelp.com
  6. 0.84% (+4.77%) – twitter.com
  7. 0.58% (+0.37%) – www.webmd.com
  8. 0.58% (+1.87%) – pinterest.com
  9. 0.52% (+1.24%) – www.walmart.com
  10. 0.49% (+4.54%) – www.tripadvisor.com
  11. 0.47% (+0.45%) – www.foodnetwork.com
  12. 0.47% (-0.44%) – allrecipes.com
  13. 0.44% (+1.98%) – www.ebay.com
  14. 0.41% (-0.76%) – www.mayoclinic.com
  15. 0.38% (+1.72%) – www.target.com
  16. 0.37% (-4.37%) – www.yellowpages.com
  17. 0.37% (+0.58%) – popular.ebay.com
  18. 0.36% (+2.12%) – www.huffingtonpost.com
  19. 0.33% (+3.27%) – www.overstock.com
  20. 0.32% (-0.32%) – www.indeed.com
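
(For reference, the share calculation itself is simple; here is a rough sketch with an assumed data structure, not MozCast's actual code.)

```python
from collections import Counter

def serp_share(serps):
    """serps: one list of sub-domains (page-1 positions) per tracked keyword.
    Returns each sub-domain's share of all page-1 positions in the data set."""
    counts = Counter(sub for serp in serps for sub in serp)
    total = sum(counts.values())
    return {sub: n / total for sub, n in counts.items()}

def day_over_day_change(today, yesterday, sub):
    """Percentage change in SERP share for one sub-domain (e.g. Yelp's -8.11%)."""
    return (today[sub] - yesterday[sub]) / yesterday[sub] * 100
```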

By percentage change, Yelp was the big day-over-day loser, at -8.11%, and Twitter picked up the highest percentage, at +4.77%. In absolute positions, YouTube picked up the most page-1 rankings, and Yelp was still the biggest loser. Overall, the Big 20 occupied 20.00% of the page-1 real estate the day after Penguin 2.0, up from 19.88% the previous day, picking up a modest number of ranking positions.

3rd-Party Analyses

I'd just like to call out a few analyses that were posted yesterday based on unique data, since there are bound to be a lot of speculative posts in the next few weeks. SearchMetrics posted its Penguin 2.0 biggest losers list, with porn and gaming sites taking the heaviest losses (Search Engine Land provided additional analysis). GetStat.com showed a jump in Top 100 rankings for big brands, but relatively small changes for most sites, and most of those changes on pages 3+ of SERPs.
 

Most reports yesterday showed relatively modest day-over-day changes (solid evidence of an algorithm update, but not a particularly big update). One exception was Dejan SEO's Australian flux tracker, Algoroo, which showed massive day-over-day flux. We believe that at least two other major algorithm updates have rolled out in May in the US, so it's possible that multiple updates were combined and hit other countries simultaneously. This is purely speculative, but no other reports seem to suggest changes on the scale of the Australian data.

The May 9th Update

I'd like to also call out an unconfirmed algorithm update in early May. There was a period of heavy flux for a few days at the beginning of the month, which was backed up by webmaster chatter and other 3rd-party reports. Temperatures on May 9th reached 83.3°F. The MozCast 7-day graph appears below:

May 9th Algo Update

The temperature spike on May 5th is unconfirmed, and may have been a test across a small number of data centers (unfortunately, our 10K data for that day was running a separate test and so we can't compare the two data sets). Reports of updates popped up across this time period, but our best guess is May 9th. Interestingly, traffic to MozCast tends to reveal when people suspect an update and are looking for confirmation, and the traffic pattern shows a similar trend:

MozCast May Traffic

Traffic data also suggest that May 5th was probably an anomaly. Private data from multiple SEOs shows sites gradually losing traffic over a couple of days in this period. Unfortunately, we have no clear explanation at this time, and I do not believe that this was directly related to Penguin 2.0. Google did roll out a domain crowding update at some point in the past couple of weeks, which may be connected to the early May data, but we don't have solid evidence either way. At this point, though, I strongly believe that the data indicates a significant algorithm update around May 9th.

Were You Hit by Penguin 2.0?

It's important to keep in mind that all of this is aggregate data. Algorithm updates are like unemployment rates. If the unemployment rate is 10%, the reality for any individual is still binary – you either have a job or you don't. You can weather 20% unemployment if you have a job (although you may worry more), and 5% unemployment is little comfort if you're jobless. In pointing out that the update was relatively small, I don't mean to suggest any lack of empathy for those hit by Penguin 2.0, but overall the impact seems to be less jarring and jolting than many people feared. If you were hit, please share your story in the comments.



Weekly Address: Giving Thanks to Our Fallen Heroes this Memorial Day

The White House: Your Daily Snapshot for Saturday, May 25, 2013
 


In this week’s address, President Obama commemorates Memorial Day by paying tribute to the men and women in uniform who have given their lives in service to our country.

Watch this week's Weekly Address.


In Case You Missed It

Responding to the Tornadoes in Oklahoma: On Monday, the President spoke with Oklahoma Governor Mary Fallin to express his concern for those who have been affected by the tornadoes in Oklahoma. The President told Governor Fallin that the administration is committed to providing all the assistance it can to Oklahoma as the response effort unfolds, including approving a Major Disaster Declaration, making federal funding available to support affected individuals, and providing additional federal assistance to support immediate response and recovery efforts.

On Tuesday, President Obama delivered a statement on the devastating tornadoes and severe weather that impacted Oklahoma. He outlined the response efforts underway, and assured the people of Moore and all the affected areas that they would have all the resources that they need at their disposal.

"Americans from every corner of this country will be right there with them, opening our homes, our hearts to those in need.  Because we're a nation that stands with our fellow citizens as long as it takes. We've seen that spirit in Joplin, in Tuscaloosa; we saw that spirit in Boston and Breezy Point.  And that’s what the people of Oklahoma are going to need from us right now. "

Morehouse College: On Sunday, President Obama delivered the commencement address to the 2013 graduates of Morehouse College in Atlanta, GA. The President told the graduates that their generation is uniquely poised for success unlike any generation of African Americans that came before it.

“It is one of the great honors of my life to be able to address this gathering here today,” President Obama told the graduates. He spoke about Morehouse’s history, and “the unique sense of purpose that this place has always infused -- the conviction that this is a training ground not only for individual success, but for leadership that can change the world.”

Meeting with the President of Myanmar: On Monday, President Obama welcomed President Thein Sein of Myanmar to the White House for a bilateral meeting, the first visit to the United States by a leader of that country in almost 50 years. During the meeting, the President recognized President Thein Sein’s leadership in moving Myanmar down a path toward political and economic reform as the driving force for improved relations between our two countries.

“We very much appreciate your efforts and leadership in leading Myanmar in a new direction,” President Obama told President Thein Sein. “We want you to know that the United States will make every effort to assist you on what I know is a long, and sometimes difficult, but ultimately correct path to follow.”

DREAMers: On Wednesday, the President and the Vice President hosted a meeting in the Oval Office with young immigrants and the siblings and spouses of undocumented immigrants. The gathering was an important opportunity for the President and the Vice President to hear directly from people whose families are affected daily by our nation’s broken immigration system. The DREAMers shared how the President’s proposal changed their lives for the better and emphasized that they and their families need a permanent solution that will allow them to fully contribute to the country they call home. As the meeting was wrapping up, the President reiterated his commitment to passing a bipartisan, commonsense immigration reform bill this year.

Gershwin Prize: On Wednesday, as part of the "In Performance at the White House" series, the White House hosted a concert honoring Carole King, the first woman to receive the Library of Congress Gershwin Prize. The Gershwin Prize honors individuals for lifetime achievement in popular music, and on Wednesday King joined recording artists James Taylor, Gloria Estefan, Billy Joel, Jesse McCartney, Emeli Sande, and Trisha Yearwood in the East Room as she accepted the award on behalf of the co-writers she worked with throughout her career.

National Defense University: On Thursday, President Obama laid out a framework for U.S. counterterrorism strategy as we wind down the war in Afghanistan. President Obama discussed how the threat of terrorism has changed substantially since September 11, 2001 -- and explained his comprehensive strategy to meet these threats.

“The quiet determination; that strength of character and bond of fellowship; that refutation of fear -- that is both our sword and our shield. And long after the current messengers of hate have faded from the world’s memory, alongside the brutal despots, and deranged madmen, and ruthless demagogues who litter history -- the flag of the United States will still wave from small-town cemeteries to national monuments, to distant outposts abroad. And that flag will still stand for freedom.”


8 Great Tools to use with Google Analytics


Posted: 24 May 2013 08:43 AM PDT

We hear so much about great SEO tools that we should be using, but we rarely see much written about tools to help you with your Google Analytics data. Over the past year I have been using more and more tools to help with Google Analytics data, from API extraction to data visualisation, and I wanted to share them with you. Below are eight tools that I use with Google Analytics on a regular basis and that save me a lot of time.

ScreamingFrog

Who has come across GA traffic disappearing, only to find it was down to some GA code being removed? And how long did it take you to find the pages that no longer have the GA code on them?

Over the past few years I have been using ScreamingFrog to check websites for any pages that have missing Google Analytics code, and comparing it against those that have. Once I have a list of URLs with missing GA code, I supply this to the internal team to have them look at it and implement. Once implemented, I then re-run the crawl to ensure the missing pages now have the GA code.

Running a ScreamingFrog report to look for Google Analytics code is very easy and can be done in six steps:

1. Open ScreamingFrog ;)
2. Enter the URL in the search bar
3. Navigate to Configuration > Custom
4. Filter 1 “contains” > UA Code.
5. Filter 2 “does not contain” > UA Code (As shown Below)
6. Hit OK, and then start the crawl.

Once the report has run you will find a list of URLs within the Custom tag, for both pages with and without GA code on them. Export the list of URLs without the Google Analytics code and send them to the developer.
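
If you would rather script a quick spot-check than run a full crawl, something along these lines works too. This is only a minimal sketch: the URL list and UA number are placeholders, and it only inspects the raw HTML (it won't catch code added via a tag manager).

```python
# Fetch each URL and flag pages whose HTML does not contain the GA tracking ID.
import requests

UA_CODE = "UA-XXXXXXX-1"   # placeholder - use your own property's tracking ID
urls = [
    "http://www.example.com/",
    "http://www.example.com/about/",
]

missing = [url for url in urls if UA_CODE not in requests.get(url, timeout=10).text]
print("Pages missing GA code:", missing)
```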

Get ScreamingFrog here.

GA Copy and Paste

The GA Copy and Paste plugin has saved me so much time over the months when I have had to replicate features or settings across a number of different accounts and profiles. The handy GA Copy and Paste Chrome plugin does exactly what it says on the tin. You navigate to your filter/goal, press the plugin button in the address bar, and press copy. You then create a new filter/goal, press the plugin button again followed by the paste button, and voila! Your filter/goal has been copied across effortlessly.

GA Copy & Paste

As I have already mentioned, this saves me a lot of time, and hopefully it will do the same for you. Get GA Copy and Paste here.

CustomReportSharing.com

Using Segments, custom dashboards and reports is becoming the norm in everyday online marketing. Diving deep into the data to get a more granular view on your marketing activities allows you and the business to make much better decisions. The huge advantage Google Analytics has is the ability to share those features with your colleagues and peers. Now it is even easier to find some great information thanks to those over at CustomReportSharing.com.

Whether you are looking for an eCommerce custom report, or you are looking to start an SEO Dashboard, CustomReportSharing.com has it all. You simply search the forums for the dashboard you are looking for and add it to your account. Some of the features have been provided by some of the brightest minds in the Analytics field, so you can always guarantee quality. You can also upload your own report, segment or dashboard by signing up. Hopefully I will see a few of you over there in the near future.

Check out CustomReportSharing.com here

GA Debugger – Firefox/Chrome

The GA Debugger extension is great for looking at useful information about your GA installation, providing any error or warning messages direct into your JavaScript console. This can be easily located either by selecting Inspect Element and choosing Console in Chrome, or opening Firebug and navigating to the Console in Firefox. I tend to use this tool when checking to ensure correct set-up and that any cookies are being passed correctly.

Get GA Debugger here.

URL builder

We tend to be involved in creating campaigns on a regular basis, with each one needing to be tracked. Whether it is PPC, display ads, media campaigns or traffic generated by your outreach team, understanding where your visits come from is essential. For those working with single URLs, Google has provided a very simple tool that generates the required string from the data you enter; you can find it here.

However, in most cases you will want to create multiple tags in one go and keep track of them for future reference. We were conducting a lot of campaigns, and to make things easier I decided to create a spreadsheet that emulates the Google URL Builder. This allowed us to create hundreds of campaign tags very quickly and easily. It has saved me lots of time in the past and I hope it will do the same for you.

Google URL Builder
Daniel’s Campaign Builder Spreadsheet – Please make a copy.
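
If you would rather script the tagging than use a spreadsheet, the same thing can be done in a few lines. The sketch below simply appends the standard utm_ parameters that the Google URL Builder generates; the example values are placeholders.

```python
from urllib.parse import urlencode

def build_campaign_url(base_url, source, medium, campaign, term=None, content=None):
    """Append Google Analytics campaign (utm) parameters to a landing page URL."""
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    if term:
        params["utm_term"] = term
    if content:
        params["utm_content"] = content
    separator = "&" if "?" in base_url else "?"
    return base_url + separator + urlencode(params)

print(build_campaign_url("http://www.example.com/landing-page",
                         source="newsletter", medium="email", campaign="may-promo"))
```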

Table Booster

The Table Booster Chrome plugin is a great way to further enhance your view of the data grid within Google Analytics by providing three different visualisations for every column.

Download Table Booster here

SEO Tools for Excel

For those of you who haven't heard about SEO Tools for Excel, where have you been? You need to go and download it right now, well, after you have finished reading this post. SEO Tools for Excel is great for lots of different SEO jobs and has automated a lot of the processes we do; however, one part that I feel is not used as much as it should be is the Google Analytics integration.

The integration allows you to run reports straight from the Google Analytics API (similar to the magic script) into your spreadsheet with no coding required. Once you have set up the report and you have all the data that you need, it becomes purely a case of adding the data into a format that works for you as a business or your client.
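
If you are comfortable working outside Excel, the same data can be pulled directly from the Core Reporting API. The sketch below is only illustrative, not SEO Tools for Excel's own code: it assumes the google-api-python-client library, an already-authorised httplib2 object (OAuth setup omitted), and a placeholder profile (view) ID.

```python
# Rough sketch of a Google Analytics Core Reporting API (v3) query.
from apiclient.discovery import build

def top_sources(http_auth, profile_id):
    """Return the top 10 traffic sources by visits for April 2013."""
    service = build("analytics", "v3", http=http_auth)  # http_auth: authorised httplib2.Http
    return service.data().ga().get(
        ids="ga:" + profile_id,        # placeholder view (profile) ID
        start_date="2013-04-01",
        end_date="2013-04-30",
        metrics="ga:visits",
        dimensions="ga:source",
        sort="-ga:visits",
        max_results=10,
    ).execute()

# result["rows"] is a list of [source, visits] pairs you can drop into a report
```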

Get SEO Tools for Excel here.

Chartelligence

This Chrome plugin has been great. Similar to Panguin, it overlays the dates of the latest Google algorithm updates on your Google Analytics charts so you can easily identify whether you have been hit by a penalty.

Another awesome feature of Chartelligence is the ability to upload your own holidays, events and site alterations so that you can see what has affected your website during different periods of the year. Over the past few months I have been using this plugin more and more, and I feel that you should be too.

Get Chartelligence here.

So that’s it – these are some of the tools I use regularly when looking into Google Analytics. What tools, plugins or extensions do you use to enhance Google Analytics? I would love to hear your thoughts in the comments below or on Twitter @danielbianchini.

Thanks to Dan Barker, Anna Lewis, Charlie Williams and Bridget Randolph for their contributions.

© SEOptimise 8 Great Tools to use with Google Analytics

Plugging the link leaks, part 2 – site canonicalisation

Posted: 22 May 2013 01:39 AM PDT

A couple of months ago we took a look at how you can reclaim links that you are simply throwing away. In this second look at fixing common link leaks, we're going to examine issues around site canonicalisation and how easy it is to lose link authority through simple duplicate URLs.

As before, this is a simple way many sites lose valuable links without even realising. By plugging these leaks, we’ve helped clients gain an instant boost in link authority, without irking anyone on Google's anti-spam team.

The many faces of a homepage

The best place to start when digging for link leaks is actually one of the checks you should carry out when looking at a new or prospective client for the first time – how many ways can you find the site homepage? Type your domain name into your browser, and then do the same for all the possible variations:

  • http://www.example.com
  • http://example.com
  • http://www.example.com/index.htm (or index.html, .aspx, .php or whatever the CMS/server uses)
  • http://www.example.com/


Does the homepage come up for more than one of these? If so, take a look in Google using a site: search for each to see if they have been indexed. If they redirect, is it via a 301 redirect, which indicates the page has permanently moved? You can check via a tool such as the Redirect Path Chrome extension or an HTTP header checker. If the redirect is not a 301 (for example, a 302 has been used), you had better go and see if that page has been indexed.
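
If you would rather check all the variations in one go than type them into a browser, a quick script does the job; the domain below is a placeholder.

```python
# Request each homepage variation without following redirects: ideally you see
# a single 200 on the canonical URL and 301s (with a Location header) elsewhere.
import requests

variations = [
    "http://www.example.com",
    "http://example.com",
    "http://www.example.com/index.htm",
    "http://www.example.com/",
]

for url in variations:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location", ""))
```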

Why does this matter? Well, for all Google's power and expertise, in some ways it works in a very simple, black-and-white manner. As each of these different versions of our homepage could, in theory, contain different content, they are regarded as different pages by the search engines.

This means each of these pages can be considered duplicate content, as Google and the other major search engines see two, or more, pages with the same content. Even worse, this is bad for SEO efforts, as it dilutes your inbound links between these different URLs; if someone links to a different variation of your homepage to the one you expect, that link equity simply leaks away. Got links pointing to the wrong version? They don’t count.

An example

But just how bad can this be? Don't most people simply link to the main version?

Our site's homepage has a large number of links to both the www and non-www version

Oh. That's bad. That's really bad.

Here you can see with a quick search in Open Site Explorer (and it works just as well with MajesticSEO and Ahrefs) a site that is losing a ton of link equity to its homepage by having links to both the www and non-www versions. A look for the brand name in Google and Bing reveals they have both selected the non-www version to show in their results pages.

If we look at the links for the entire subdomain in MajesticSEO and Open Site Explorer it gets worse.

A look in OSE shows there are many links to the www sub-domain

MajesticSEO also shows links to the www sub-domain that are 'leaking' away

This site is leaking links not just to the homepage, but to multiple pages, and losing huge amounts of link equity.

Once you've checked for your homepage, and looked to see how many link leaks have sprung up as a result, you might want to check whether these variations have occurred on any inner pages. This is a less common issue than with the homepage, but still something we see regularly.

Firstly, to check internally, do a crawl of the site using a tool such as Screaming Frog and navigate to the page titles tab. Here you can filter for duplicate title tags; if you have duplicates, check to see if these are variations of the same URL. If any are, that means that somewhere on the site you have internal links pointing to different URLs for the same resource.
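
If you prefer to work from the crawl export rather than the filter in the interface, a few lines of scripting will group URLs by title; the file and column names below are assumed from a typical Screaming Frog CSV export.

```python
# Group crawled URLs by title tag to spot pages reachable at more than one URL.
import csv
from collections import defaultdict

by_title = defaultdict(list)
with open("internal_html.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        by_title[row["Title 1"]].append(row["Address"])

for title, urls in by_title.items():
    if len(urls) > 1:                 # same title on several URLs - investigate
        print(title)
        for url in urls:
            print("  ", url)
```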

Then, to see if Google is finding duplicate versions, take a look in Google Webmaster Tools. Navigate to the HTML suggestions tool in Diagnostics, and look for duplicate title tags and meta descriptions, again checking to see if there are any pages with multiple URLs for the same content. Wherever you find different ways to reach the same URL, make a note of them for later.

Checking Webmaster Tools for duplicate versions of the same page

Fixing the link leaks

Regardless of where you found your link leaks, you want to plug them. Fortunately, in most cases this is a relatively straightforward job.

Now of course, the best solution to plugging this kind of link leak is to set up your site's architecture so that duplicate URLs cannot occur. With many CMSs and site set-ups, however, this can be tricky.

So, the next alternative is to put in place site-wide redirects that automatically send visitors to a consistent version of each URL. Canonical issues you should make sure are handled include:

  • Redirecting either www or non-www version (whichever you are not using)
  • Redirecting /index.html, or equivalent, URLs
  • Ensuring all URLs use the same capitalisation – normally lowercase
  • Use of a trailing slash at the end of a URL – do you want to use it or not?

Doing site-wide redirects this way allows you to capture common duplicate URLs in one stroke. They are not always possible, however, in which case you'll have to set up individual redirects. If so, make sure you capture every duplicate page that you have linked to internally and every one that is appearing in Google and Bing's indices. Site: searches and a check of Google Analytics landing pages can also help, as can a thorough check of backlink targets from your tool of choice.

There are not necessarily right or wrong choices of the URL version to use – the important thing is to consolidate all the options into one URL for each page, and to be consistent. In most cases the most powerful, consistent format will be clear. 301 redirects need to be created so that each page or resource can only be reached by one URL. You want to set it up so that if someone links to, or types in a different variation, say http://www.example.com/PageTitle or http://example.com/pagetitle, they are 301 redirected to http://www.example.com/pagetitle. By using 301 redirects, all or nearly all of any link juice that page has accrued will be passed along to the target URL.
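
To make that mapping concrete, here is a small sketch of the normalisation decisions a site-wide redirect rule has to encode. The preferred host, lowercase paths and index-stripping below are illustrative choices for example.com, not recommendations for every site.

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.example.com"   # pick one host and stick to it

def canonical_url(url):
    """Map any duplicate variation of a URL to its single canonical form."""
    scheme, host, path, query, fragment = urlsplit(url)
    if host in ("example.com", "www.example.com"):
        host = PREFERRED_HOST
    path = path.lower() or "/"                        # consistent casing
    if path.endswith(("/index.htm", "/index.html")):  # strip index documents
        path = path.rsplit("/", 1)[0] + "/"
    return urlunsplit((scheme, host, path, query, fragment))

assert canonical_url("http://example.com/PageTitle") == "http://www.example.com/pagetitle"
```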

In the example above, we added a redirect sending the www versions of each page to the non-www version, and immediately got an extra 120 linking domains for the site’s homepage.

There are, however, times when, sadly, implementing 301 redirects is not possible, either through lack of access to a client's developer, or CMS limitations. When this occurs, our third choice of solution is rel=canonical tags.

Rel=canonical should only be used if other options are not possible, or in conjunction with 301 redirects if you are worried that there are so many variations of a URL out there that you want a 'belt and braces' approach.

The rel=canonical tag gives a strong indicator to the search engines that we want the URL in the tag to be regarded as the primary version of this page, and other variants (marked with the tag) to be regarded as duplicates, not to be indexed. This often works extremely well, but does have a couple of disadvantages.

Firstly, whilst it is a strong hint, and Google will generally honour the canonical request, it does not have to, should it have reason. Google has said that it is better to fix site issues in the first place, or use 301 redirects. Secondly, using this method means that these pages can still be crawled, so you are wasting crawl budget on pages that you don't even want Google to index. Finally, there is no guarantee that other search engines will listen to your canonical tag, and that might be important in certain countries your site may be targeting.

However, despite this, the rel=canonical tag can be helpful if you genuinely can't redirect your duplicate URL variations. Not only will Google generally index your preferred URL, they will also try and consolidate all the links pointing to the different URL variations and apply them to the specified canonical address, getting you back your lost link authority. As Google put it themselves in this handy recent post on rel=canonical mistakes:

The rel=canonical link consolidates indexing properties from the duplicates, like their inbound links, as well as specifies which URL you'd like displayed in search results.

Next you want to check your Screaming Frog internal crawl, and go to the Advanced Export option in the main navigation and select 'Success (200) In Links'. Take this export into Excel, and turn it into a table (ctrl + t). Filter the Destination column by your duplicate URLs (that you noted earlier) to find all the internal links pointing to each non-canonical variation. You can then pass this on to your client or developer so all the incorrect internal links can be fixed. The least you can do is make sure you are not directing any users, and no search engines, to the wrong URLs!
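
Here is a quick sketch of that filtering step in pandas, if you would rather not do it by hand in Excel. The file name and the duplicate URLs are placeholders; 'Source' and 'Destination' are the columns from the in-links export described above.

```python
# Reduce the Screaming Frog in-links export to the internal links that point at
# non-canonical URL variations, ready to hand over to the developer.
import pandas as pd

duplicate_urls = [
    "http://example.com/pagetitle",        # placeholder variations noted earlier
    "http://www.example.com/PageTitle",
]

links = pd.read_csv("all_inlinks.csv")
bad_links = links[links["Destination"].isin(duplicate_urls)]
bad_links[["Source", "Destination"]].to_csv("links_to_fix.csv", index=False)
print(f"{len(bad_links)} internal links point at duplicate URLs")
```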

Finally, if you haven't already, verify both the www and non-www versions of your site in Google and Bing Webmaster Tools. Not only will this allow you to spot issues the search engines are having with your site, it also lets you explicitly set which URL variation you want to show in the SERPs in Google Webmaster Tools – but only if you've verified you own both (another clue, if the above was not enough, that different sub-domains are regarded as separate sites).

Choosing your preferred domain in Google Webmaster Tools

And that's it. A relatively straightforward process, but as the examples above prove, one that can reveal a rather tasty number of links that are simply leaking away. As with link reclamation, for only a couple of hours' work you can give your site a boost of link authority that you’ve already earned.

Do you have any examples of helping a site that was leaking links this way? Or advice on how to check for, or remedy, multiple URLs for the same resource with a particular CMS?

© SEOptimise Plugging the link leaks, part 2 – site canonicalisation