
8 Great Tools to use with Google Analytics


Posted: 24 May 2013 08:43 AM PDT

We hear so much about great SEO tools that we should be using, but we rarely see much written about tools to help you with your Google Analytics data. Over the past year I have been using more and more tools to help with Google Analytics data, from API extraction to data visualisation, and I wanted to share them with you. Below are just 8 tools that I use with Google Analytics on a regular basis and that save me a lot of time.

ScreamingFrog

Who has come across GA traffic disappearing, only to find it was down to some GA code being removed? How long did it take you to find the pages that no longer have the GA code on them?

Over the past few years I have been using ScreamingFrog to check websites for any pages that are missing the Google Analytics code, and to compare them against those that have it. Once I have a list of URLs with missing GA code, I supply it to the internal team to review and implement. Once implemented, I re-run the crawl to ensure the missing pages now have the GA code.

Running a ScreamingFrog report to look for Google Analytics code is very easy and can be done in six steps:

1. Open ScreamingFrog ;)
2. Enter the URL in the search bar
3. Navigate to Configuration > Custom
4. Filter 1: “contains” > your UA code
5. Filter 2: “does not contain” > your UA code
6. Hit OK, and then start the crawl.

Once the report has run you will find a list of URLs within the Custom tag, for both pages with and without GA code on them. Export the list of URLs without the Google Analytics code and send them to the developer.
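If you just need a quick spot-check outside of a full crawl, the same "does the page contain the UA code?" test is easy to script. A minimal Python sketch, using placeholder page sources and a made-up UA code rather than a live crawl:

```python
# Hypothetical tracking ID for illustration; substitute your own UA code.
UA_CODE = "UA-1234567-1"

def has_ga_code(html, ua_code=UA_CODE):
    """Return True if the page source contains the given GA tracking ID."""
    return ua_code in html

# Sample page sources standing in for a real crawl's fetched HTML.
pages = {
    "/tracked": "<script>_gaq.push(['_setAccount', 'UA-1234567-1']);</script>",
    "/untracked": "<html><body>No analytics here</body></html>",
}

# The list of URLs to hand to the developers.
missing = [url for url, html in pages.items() if not has_ga_code(html)]
print(missing)
```

In practice you would feed in the HTML fetched for each URL in your crawl list; the check itself is just a substring match on the tracking ID.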

Get ScreamingFrog here.

GA Copy and Paste

The GA Copy and Paste plugin has saved me so much time over the months when I have had to replicate features or settings across a number of different accounts and profiles. The handy GA Copy and Paste Chrome plugin does exactly what it says on the tin. You navigate to your filter/goal, press the plugin button in the address bar, and press copy. You then create a new filter/goal, press the plugin button again followed by the paste button, and voila! Your filter/goal has been copied across effortlessly.

GA Copy & Paste

As I have already mentioned, this saves me a lot of time, and hopefully it will do the same for you. Get GA Copy and Paste here.

CustomReportSharing.com

Using Segments, custom dashboards and reports is becoming the norm in everyday online marketing. Diving deep into the data to get a more granular view on your marketing activities allows you and the business to make much better decisions. The huge advantage Google Analytics has is the ability to share those features with your colleagues and peers. Now it is even easier to find some great information thanks to those over at CustomReportSharing.com.

Whether you are looking for an eCommerce custom report or want to start an SEO dashboard, CustomReportSharing.com has it all. You simply search the forums for the dashboard you are looking for and add it to your account. Many of the features have been contributed by some of the brightest minds in the analytics field, so quality is all but guaranteed. You can also upload your own report, segment or dashboard by signing up. Hopefully I will see a few of you over there in the near future.

Check out CustomReportSharing.com here.

GA Debugger – Firefox/Chrome

The GA Debugger extension is great for looking at useful information about your GA installation, sending any error or warning messages directly to your JavaScript console. This can be easily located either by selecting Inspect Element and choosing Console in Chrome, or by opening Firebug and navigating to the Console in Firefox. I tend to use this tool to check that the set-up is correct and that any cookies are being passed correctly.

Get GA Debugger here.

URL builder

We tend to be involved in creating campaigns on a regular basis, with each one needing to be tracked. Whether it is PPC, display ads, media campaigns or traffic generated by your outreach team, having an understanding of where your visits come from is essential. For those working with single URLs, Google has provided a very simple tool that allows you to enter the data you require and generate the tagged URL here.

However, in most cases you will want to create multiple tags in one go, and to keep track of those for future reference. We were conducting a lot of campaigns, and to make things easier I decided to create a spreadsheet that emulated the Google URL Builder. This allowed us to create hundreds of campaign tags very quickly and easily. It has saved me lots of time in the past and I hope it will do the same for you.

Google URL Builder
Daniel’s Campaign Builder Spreadsheet – Please make a copy.
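The spreadsheet essentially automates the same query-string logic as the URL Builder; a minimal Python sketch of it (the landing URL and campaign values below are made up for illustration):

```python
from urllib.parse import urlencode

def build_campaign_url(base_url, source, medium, campaign,
                       term=None, content=None):
    """Append Google Analytics utm_* campaign parameters to a URL."""
    params = {
        "utm_source": source,      # where the traffic comes from
        "utm_medium": medium,      # e.g. email, cpc, banner
        "utm_campaign": campaign,  # the campaign name
    }
    if term:
        params["utm_term"] = term          # optional: paid keywords
    if content:
        params["utm_content"] = content    # optional: ad/link variant
    separator = "&" if "?" in base_url else "?"
    return base_url + separator + urlencode(params)

url = build_campaign_url("http://www.example.com/landing",
                         source="newsletter", medium="email",
                         campaign="spring-sale")
print(url)
```

Looping this over a list of landing pages gives you hundreds of consistently tagged URLs in one go, which is exactly what the spreadsheet does with formulas.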

Table Booster

The Table Booster Chrome plugin is a great way to further enhance your view of the data grid within Google Analytics by providing three different visualisations for every column.

Download Table Booster here

SEO Tools for Excel

For those of you who haven’t heard about SEO Tools for Excel, where have you been? You need to go and download it right now – well, after you have finished reading this post. SEO Tools for Excel is great for lots of different SEO jobs, and has automated a lot of processes that we do; however, one part I feel is not used as much as it should be is the Google Analytics integration.

The integration allows you to run reports straight from the Google Analytics API (similar to the magic script) into your spreadsheet with no coding required. Once you have set up the report and you have all the data that you need, it becomes purely a case of shaping the data into a format that works for you as a business, or for your client.
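Once the rows come back from the API, shaping them is straightforward. A small Python sketch of that step, using sample data laid out in the Core Reporting API's headers-plus-rows shape rather than a live API call:

```python
# Sample data in the tabular shape the GA Core Reporting API returns
# (and that SEO Tools for Excel drops into your sheet); not a live call.
headers = ["ga:source", "ga:medium", "ga:visits"]
rows = [
    ["google", "organic", "1200"],
    ["bing", "organic", "90"],
    ["newsletter", "email", "45"],
]

# Turn each row into a dict keyed by dimension/metric name.
report = [dict(zip(headers, row)) for row in rows]

# From here, summarising for a client report is simple.
organic_visits = sum(int(r["ga:visits"]) for r in report
                     if r["ga:medium"] == "organic")
print(organic_visits)
```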

Get SEO Tools for Excel here.

Chartelligence

This Chrome plugin has been great. Similar to Panguin, it overlays the dates of the latest Google algorithm updates on your Google Analytics graphs, so you can easily identify whether you have been hit by a penalty.

Another awesome feature that Chartelligence has is the ability to upload your own holidays, events and site alterations, so that you can see what has affected your website during different periods of the year. Over the past few months I have been using this plugin more and more, and I feel that you should be too.

Get Chartelligence here.

So that’s it – these are some of the tools I use regularly when looking into Google Analytics. What tools, plugins or extensions do you use to enhance Google Analytics? I would love to hear your thoughts in the comments below or on Twitter @danielbianchini.

Thanks to Dan Barker, Anna Lewis, Charlie Williams and Bridget Randolph for their contributions.

© SEOptimise 8 Great Tools to use with Google Analytics

Plugging the link leaks, part 2 – site canonicalisation

Posted: 22 May 2013 01:39 AM PDT

A couple of months ago we took a look at how you can reclaim links that you are simply throwing away. For the second look at how to fix common link leaks, we're going to look at issues around site canonicalisation, and how easy it is to lose link authority through simple duplicate URLs.

As before, this is a simple way many sites lose valuable links without even realising. By plugging these leaks, we’ve helped clients gain an instant boost in link authority, without irking anyone on Google's anti-spam team.

The many faces of a homepage

The best place to start when digging for link leaks is actually one of the checks you should carry out when looking at a new or prospective client for the first time – how many ways can you find the site homepage? Type your domain name into your browser, and then do the same for all the possible variations:

  • http://www.example.com
  • http://example.com
  • http://www.example.com/index.htm (or index.html, .aspx, .php or whatever the CMS/server uses)
  • http://www.example.com/
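Enumerating the variations to check is easy to script; a small Python sketch (the index filename is an assumption – adjust it for whatever your CMS or server uses):

```python
def homepage_variations(domain, index_file="index.html"):
    """List the common duplicate URLs a site's homepage can live at."""
    # Normalise to the bare (non-www) hostname first.
    bare = domain[4:] if domain.startswith("www.") else domain
    urls = []
    for host in (bare, "www." + bare):
        urls.append("http://%s" % host)               # no trailing slash
        urls.append("http://%s/" % host)              # trailing slash
        urls.append("http://%s/%s" % (host, index_file))  # index file
    return urls

for url in homepage_variations("example.com"):
    print(url)
```

Feed each of these into your browser, a site: search, and an HTTP header checker to see which ones resolve, which redirect, and how.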


Does the homepage come up for more than one of these? If so, take a look in Google using a site: search for each to see if they have been indexed. If they redirect, do they redirect using a 301, which indicates the page has permanently moved? You can check via a tool such as the Redirect Path Chrome extension or an HTTP header checker. If the redirect is not a 301 (for example, a 302 has been used), you had better go and see if that page has been indexed.

Why does this matter? Well, for all Google's power and expertise, in some ways it works in a very simple, black-and-white manner. As each of these different versions of our homepage could, in theory, contain different content, they are regarded as different pages by the search engines.

This means each of these pages can be considered duplicate content, as Google and the other major search engines see two, or more, pages with the same content. Even worse for your SEO efforts, it dilutes your inbound links between these different URLs; if someone links to a different variation of your homepage from the one you expect, that link equity simply leaks away. Got links pointing to the wrong version? They don’t count.

An example

But just how bad can this be? Don't most people simply link to the main version?

Our site's homepage has a large amount of links to both the www and non-www version

Oh. That's bad. That's really bad.

Here you can see with a quick search in Open Site Explorer (and it works just as well with MajesticSEO and Ahrefs) a site that is losing a ton of link equity to its homepage by having links to both the www and non-www versions. A look for the brand name in Google and Bing reveals they have both selected the non-www version to show in their results pages.

If we look at the links for the entire subdomain in MajesticSEO and Open Site Explorer it gets worse.

A look in OSE shows there are many links to the www sub-domain

MajesticSEO also shows links to the www sub-domain that are 'leaking' away

This site is leaking links not just to the homepage, but to multiple pages, and losing huge amounts of link equity.

Once you've checked for your homepage, and looked to see how many link leaks have sprung up as a result, you might want to check to see if these variations have occurred on any inner pages. This is a less common issue than the homepage, but still something we see regularly.

Firstly, to check internally, do a crawl of the site using a tool such as Screaming Frog and navigate to the page titles tab. Here you can filter for duplicate title tags; if you have duplicates, check to see if these are variations of the same URL. If any are, that means that somewhere on the site you have internal links pointing to different URLs for the same resource.

Then, to see if Google is finding duplicate versions, take a look in Google Webmaster Tools. Navigate to the HTML suggestions tool in Diagnostics, and look for duplicate title tags and meta descriptions, again checking to see if there are any pages with multiple URLs for the same content. Wherever you find different ways to reach the same URL, make a note of them for later.

Checking Webmaster Tools for duplicate versions of the same page

Fixing the link leaks

Regardless of where you found your link leaks, you want to plug them. Fortunately, in most cases this is a relatively straightforward job.

Now of course, the best solution to plugging this kind of link leak is to set up your site's architecture so that duplicate URLs cannot occur. However, with many CMSs and site set-ups, this can be tricky.

So, the next alternative is to put in place site-wide redirects that automatically point to a consistent version of each URL. Canonical issues you should cover include:

  • Redirecting either the www or non-www version (whichever you are not using)
  • Redirecting /index.html, or equivalent, URLs
  • Ensuring all URLs use the same capitalisation – normally lowercase
  • Deciding whether or not to use a trailing slash at the end of a URL
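These rules amount to a single normalisation function. The Python sketch below expresses the target behaviour; the actual redirects would live in your server config or CMS, with each request 301ing to the canonical form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalise(url, use_www=False, trailing_slash=True):
    """Rewrite a URL into one consistent canonical form: pick a single
    hostname, lowercase the path, drop index files, fix the slash."""
    scheme, host, path, query, fragment = urlsplit(url)
    # One hostname: strip or add the www prefix.
    host = host.lower()
    bare = host[4:] if host.startswith("www.") else host
    host = "www." + bare if use_www else bare
    # One capitalisation: lowercase the path.
    path = path.lower()
    # Drop index files back to the directory URL.
    for index in ("/index.html", "/index.htm", "/index.php"):
        if path.endswith(index):
            path = path[: -len(index)] + "/"
    # One trailing-slash policy.
    if trailing_slash and not path.endswith("/"):
        path += "/"
    elif not trailing_slash and path.endswith("/") and path != "/":
        path = path[:-1]
    return urlunsplit((scheme, host, path or "/", query, fragment))

print(canonicalise("http://www.Example.com/PageTitle"))
```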

Doing site-wide redirects this way allows you to capture common duplicate URLs in one stroke. They are not always possible, however, in which case you'll have to do individual redirects. If so, make sure you capture every duplicate page that is linked to internally or is appearing in Google and Bing's indices. Site: searches and a check of Google Analytics landing pages can also help, as can a thorough check of backlink targets from your tool of choice.

There are not necessarily right or wrong choices of the URL version to use – the important thing is to consolidate all the options into one URL for each page, and to be consistent. In most cases the most powerful, consistent format will be clear. 301 redirects need to be created so that each page or resource can only be reached by one URL. You want to set it up so that if someone links to, or types in a different variation, say http://www.example.com/PageTitle or http://example.com/pagetitle, they are 301 redirected to http://www.example.com/pagetitle. By using 301 redirects, all or nearly all of any link juice that page has accrued will be passed along to the target URL.

In the example above, we added a redirect sending the www versions of each page to the non-www version, and immediately got an extra 120 linking domains for the site’s homepage.

There are, however, times when, sadly, implementing 301 redirects is not possible, either through lack of access to a client's developer, or CMS limitations. When this occurs, our third choice of solution is rel=canonical tags.

Rel=canonical should only be used if other options are not possible, or in conjunction with 301 redirects if you are worried that there are so many variations of a URL out there that you want a 'belt and braces' approach.

The rel=canonical tag gives a strong indicator to the search engines that we want the URL in the tag to be regarded as the primary version of this page, and other variants (marked with the tag) to be regarded as duplicates, not to be indexed. This often works extremely well, but does have a couple of disadvantages.

Firstly, whilst it is a strong hint, and Google will generally honour the canonical request, it does not have to, should it have reason. Google has said that it is better to fix site issues in the first place, or use 301 redirects. Secondly, using this method means that these pages can still be crawled, so you are wasting crawl budget on pages that you don't even want Google to index. Finally, there is no guarantee that other search engines will listen to your canonical tag, and that might be important in certain countries your site may be targeting.

However, despite this, the rel=canonical tag can be helpful if you genuinely can't redirect your duplicate URL variations. Not only will Google generally index your preferred URL, they will also try and consolidate all the links pointing to the different URL variations and apply them to the specified canonical address, getting you back your lost link authority. As Google put it themselves in this handy recent post on rel=canonical mistakes:

The rel=canonical link consolidates indexing properties from the duplicates, like their inbound links, as well as specifies which URL you'd like displayed in search results.
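The tag itself is a single line in the head of each duplicate variation, pointing at the preferred URL; reusing the earlier example URLs, each variant would carry:

```html
<!-- Placed in the <head> of http://example.com/pagetitle,
     http://www.example.com/PageTitle, and any other variants -->
<link rel="canonical" href="http://www.example.com/pagetitle" />
```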

Next you want to check your Screaming Frog internal crawl, and go to the Advanced Export option in the main navigation and select 'Success (200) In Links'. Take this export into Excel, and turn it into a table (ctrl + t). Filter the Destination column by your duplicate URLs (that you noted earlier) to find all the internal links pointing to each non-canonical variation. You can then pass this on to your client or developer so all the incorrect internal links can be fixed. The least you can do is make sure you are not directing any users, and no search engines, to the wrong URLs!
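If the export is large, the filtering step can also be scripted. A Python sketch using sample rows in a two-column Source/Destination shape (real Screaming Frog exports carry more columns, but the filter is the same):

```python
import csv
import io

# Sample rows standing in for the 'Success (200) In Links' export.
export = io.StringIO(
    "Source,Destination\n"
    "http://example.com/,http://example.com/pagetitle\n"
    "http://example.com/about,http://example.com/PageTitle\n"
    "http://example.com/contact,http://www.example.com/pagetitle\n"
)

# The non-canonical URL variations you noted earlier.
duplicates = {
    "http://example.com/PageTitle",
    "http://www.example.com/pagetitle",
}

# Keep only internal links whose destination is a non-canonical variant.
bad_links = [row for row in csv.DictReader(export)
             if row["Destination"] in duplicates]
for row in bad_links:
    print(row["Source"], "->", row["Destination"])
```

The resulting list of source pages is what you hand to the client or developer so the internal links can be corrected.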

Finally, if you haven't already, verify both the www and non-www versions of your site in Google and Bing Webmaster Tools. Not only will this allow you to spot issues the search engines are having with your site, but in Google Webmaster Tools you can explicitly request which URL variation you want to show in the SERPs – though only if you've verified you own both (another clue that different sub-domains are regarded as separate sites, if the above was not enough).

Choosing your preferred domain in Google Webmaster Tools

And that's it. A relatively straightforward process, but as the examples above prove, one that can reveal a rather tasty number of links that are simply leaking away. As with link reclamation, for only a couple of hours' work you can give your site a boost of link authority that you’ve already earned.

Do you have any examples of helping a site leaking links this way? Or advice on how to check, or remedy, multiple URLs for the same resource with a particular CMS?

© SEOptimise Plugging the link leaks, part 2 – site canonicalisation
