Friday, 8 October 2010

Michael Gray - Graywolf's SEO Blog

How to Diagnose and Improve Website Crawling

Posted: 07 Oct 2010 11:00 AM PDT

When you are reviewing a website, whether for your own projects or for a client project, one of the important areas to review is crawlability. In this post I’d like to talk about some of the ways you can look for and diagnose crawling issues.

The first step in diagnosing a crawling problem is to run a simple [site:example.com] search and compare how many pages you really have with how many Google thinks you have. Bear in mind that this number is an estimate; what you are trying to get is a rough idea of how many pages Google knows about, as Matt Cutts recently discussed in a Webmaster Central video:

[Embedded video: Matt Cutts, Google Webmaster Central]

If you have several hundred or several thousand pages but Google only shows 100, you have a problem. Depending on how large the site is, a Google estimate within 10-30% of your actual page count is a good rule of thumb.

The second thing to look at is Webmaster Central. If you submit a sitemap, Google tells you how many URLs you submitted and how many are in the index. The closer those two numbers are, the better. Don't worry if it's not a 100% match, because sitemaps sometimes include pages that get blocked at the page level with a robots meta tag. At this point, you are just concerned with gross numbers.

[Screenshot: sitemap statistics in Webmaster Central]
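
If you want to automate that gross-numbers check, counting the URLs in your sitemap takes only a few lines. Here's a minimal Python sketch, assuming a standard sitemaps.org XML file at a placeholder address (a sitemap index file would need one more loop over its child sitemaps):

    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder: use your own sitemap
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)

    urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]
    print(f"URLs in sitemap: {len(urls)}")
    # Compare this count against the indexed figure Webmaster Central reports
    # (or the rough [site:] estimate you noted earlier).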

If things are radically out of whack, you can download a table of indexed pages from Webmaster Central and diagnose on a page-by-page level to see what is or isn't in the index.
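
Once you have that export, a set difference against your own URL list shows you exactly which pages are missing. A quick sketch (the file names here are hypothetical; each file holds one URL per line):

    def load_urls(path):
        """Read one URL per line, skipping blanks."""
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    site_urls = load_urls("all_pages.txt")    # your full URL list, e.g. from the sitemap
    indexed = load_urls("indexed_pages.txt")  # the table downloaded from Webmaster Central

    for url in sorted(site_urls - indexed):
        print("Not indexed:", url)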

Next, you want to do a full crawl of the website using something like Xenu. While it's usually used to check for broken links, it crawls the whole website in the process. If you have a large website, you'll want to limit the crawl.

Another product that I like to use is Website Auditor. One of its interesting features is that you can specify crawling depth, i.e. how deep you want a crawl to go. Start at the homepage and go only one level; run it again with two levels, then three. Additionally, use your Webmaster Central report on most-linked pages (think of them as link hubs). If your important pages aren't within 2-3 pages of the linking hubs on your website, you will have problems. IMHO it's more important than ever to cultivate deep linking and to use that deep linking to spread your link equity, inbound trust, and authority wisely around your website.
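
If you'd rather script the depth test, a rough depth-limited crawl is easy to sketch. This Python version uses the third-party requests and beautifulsoup4 packages, starts from a placeholder URL, and records the shallowest click depth at which each page is found; it deliberately ignores things a real crawler handles, like robots.txt, redirects, and rate limiting:

    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "https://example.com/"  # placeholder: your homepage (or a link hub)
    MAX_DEPTH = 3
    HOST = urlparse(START).netloc

    depths = {START: 0}   # shallowest depth at which each URL was found
    queue = deque([START])

    while queue:
        url = queue.popleft()
        if depths[url] >= MAX_DEPTH:
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # stay on-site and keep only the first (shallowest) sighting
            if urlparse(link).netloc == HOST and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)

    for url, depth in sorted(depths.items(), key=lambda kv: kv[1]):
        print(depth, url)

Anything that only shows up at depth 3 or deeper is a candidate trouble spot.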

In recent years Google has done away with the term/classification "supplemental index." IMHO this was more of a public relations move: they just grew tired of hearing from people who were upset that any part of their site was in the supplemental index. But I digress. Certain parts of your website aren't as important as others or, as in the case of, say, a privacy policy, are important to people but not for rankings. To understand which pages Google thinks are important, you need to look at the last crawl date in the Google cache.

Pages that have the most links, trust, and authority are going to get crawled most frequently. Pages that are linked to from those linking hubs, or trusted and authoritative hubs, will get crawled next most frequently. At each step away from the linking hubs, or authority points, crawl frequency will decrease; think of it like a classic PageRank model.
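
To make that model concrete, here is a toy PageRank calculation in Python. The link graph is invented purely for illustration: a hub with three inbound links feeding two level-1 pages, each of which feeds two level-2 pages. The scores, a rough proxy for crawl priority, drop at each step away from the hub:

    links = {
        "ext1": ["hub"], "ext2": ["hub"], "ext3": ["hub"],  # inbound links give the hub its authority
        "hub": ["a1", "a2"],   # hub links to two level-1 pages
        "a1": ["b1", "b2"],    # each level-1 page links to two level-2 pages
        "a2": ["b3", "b4"],
        "b1": [], "b2": [], "b3": [], "b4": [],
    }
    pages = list(links)
    d = 0.85  # standard damping factor
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):  # power iteration
        new = {p: (1 - d) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                for q in outs:
                    new[q] += d * rank[p] / len(outs)
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += d * rank[p] / len(pages)
        rank = new

    for p in sorted(pages, key=rank.get, reverse=True):
        print(f"{p}: {rank[p]:.3f}")

Running it shows the hub scoring highest, the level-1 pages next, and the level-2 pages below that: the decay pattern described above.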

Ideally, what you want to do is get a sample of pages from different levels and determine their crawl frequency. Programs like Website Auditor will do this for you; however, you are very likely to trip Google's automated query blocker, which means that, if you go that route, you'll need someone to sit there and solve CAPTCHAs for a few hours. A secondary method is to use an outsourcing service (I use oDesk) and have them do it for you: send them several hundred URLs in a spreadsheet and explain how to check the cache date and enter it in the spreadsheet. Do some spot checking when the work comes back, and try to find a handful of people you can trust and use them on a regular basis.

So how do you know you’re in trouble? Do your important pages have crawl dates older than 60 days? Are there entire sections that you think are important that aren’t getting crawled as frequently as you want?
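
Once the spreadsheet comes back, flagging the trouble spots is easy to script. A small sketch, assuming a hypothetical cache_dates.csv with url and cache_date columns and ISO-format dates (adjust the parsing to whatever format your checkers used):

    import csv
    from datetime import date, datetime, timedelta

    CUTOFF = date.today() - timedelta(days=60)

    with open("cache_dates.csv", newline="") as f:
        for row in csv.DictReader(f):
            cached = datetime.strptime(row["cache_date"], "%Y-%m-%d").date()
            if cached < CUTOFF:
                print(f"STALE ({cached}): {row['url']}")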

If you find that the site you are working on has crawling issues, look for ways to flatten the site hierarchy; I talked about this in How do You Archive Posts on a High Volume Website. Look at the pages that are your linking hubs: are you using them wisely by interlinking to other content? I talked about this in How to Silo Your Website: The Content, and breadcrumbs are another key tactic for interlinking. Look into ways to rotate some of your older content onto your homepage (see how to make your homepage more dynamic). Lastly, look for ways to make better use of your link equity by performing a content audit and removing or updating old, outdated, or unimportant sections.

So what are the takeaways from this post?

  • Take a quick estimate of how thoroughly your site is crawled
  • Look at the pages in the index using Webmaster Central
  • Identify link hubs on your website
  • Try test crawling to different depths
  • Check cache dates across a section of URLs for your website
  • Identify trouble spots, flatten site architecture, improve interlinking, and trim down unimportant pages/sections

This post originally came from Michael Gray, who is an SEO consultant.

Related posts:

  1. Deep Crawling a Mini Site Part II After correcting some “errors” in my mini-site architecture (see Deep...
  2. How To Figure Out What Parts of Your Website Aren’t Being Crawled When Google took away the supplemental index last year, they...
  3. Superficial Crawling SEO Strategies On WebmasterWorld people are discussing big daddy strategies, and on...
  4. How To Silo Your Website: The Breadcrumb Trail In Part 1 we looked at How To Silo Your...
  5. 6 Tools & Tips to Help You Improve Your Blog Posts I was having a few conversations on twitter last...
