Thursday, 1 December 2011

Tips for Ajax for SEO - Graywolf's SEO Blog

Tips for Ajax for SEO

Posted: 01 Dec 2011 10:23 AM PST

Whenever Ajax enters the conversation for an SEO or internet marketer, chances are good there will be a deep sigh or an "ugh" face. While it is true that search engines are getting better at indexing this type of content, we still aren't at the point where you can realistically rely on them to index it properly, or even at all. That doesn't mean you can't use it; it just means you need to take some extra steps to make sure that type of content is visible to crawlers and non-Ajax users.

The first step is to make sure a static page and URL exists for every end result of content. For example, let's say you run a local travel website, and you have a location/map page that lets people view restaurants, hotels, attractions, or other information within a specific area. Users can turn filters on or off, look at different locations, and get detailed information about each venue. It would be a very good user experience to have that work via Ajax and JavaScript, similar to Google Maps integrating data from Google Places. However, "hiding" all that information behind Ajax won't help you with your organic search traffic.

What you need to do is create specific, unique URLs for each of those destinations. These URLs need to provide the information in a way that ALL the search engine spiders can read and extract, not just the advanced experimental Ajax crawling spider from Google. This ensures you will get traffic from Yahoo, Bing, Facebook, Twitter, StumbleUpon, and, heck, even services like Blekko and Wolfram Alpha. Relying on just one search engine or source for your traffic is a dangerous strategy and leaves you defenseless against the whims of an algorithm update.
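
To make this concrete, here is a minimal sketch in TypeScript with Express of serving a plain, crawlable HTML page at its own unique URL for every venue. The venue fields, slugs, and route shape are assumptions for illustration, not a prescription:

```typescript
import express from "express";

// Hypothetical venue records -- in practice these come from the same
// database that already feeds the Ajax view.
interface Venue {
  slug: string; // e.g. "joes-crab-shack-miami-beach"
  name: string;
  category: "restaurants" | "hotels" | "attractions";
  description: string;
}

const venues: Venue[] = [
  {
    slug: "joes-crab-shack-miami-beach",
    name: "Joe's Crab Shack",
    category: "restaurants",
    description: "Seafood on Miami Beach.",
  },
];

const app = express();

// One static, crawlable URL per destination, e.g.
// /restaurants/joes-crab-shack-miami-beach
app.get("/:category/:slug", (req, res) => {
  const venue = venues.find(
    (v) => v.category === req.params.category && v.slug === req.params.slug
  );
  if (!venue) {
    res.status(404).send("Not found");
    return;
  }
  // Plain HTML that ALL spiders can read, not just Google's Ajax crawler.
  res.send(`<!doctype html>
<html>
  <head><title>${venue.name}</title></head>
  <body><h1>${venue.name}</h1><p>${venue.description}</p></body>
</html>`);
});

app.listen(3000);
```

The point is simply that every piece of content the Ajax view can reach also exists at an address a basic spider can fetch and parse.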

Once you have each of those pages, you want to make sure the URL is as search engine friendly as possible: short, with 3-5 keywords in the URL and no parameters. While it's a bit of overkill, providing the rel="canonical" tag is a good idea as well.
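
As a small sketch of what that looks like in practice (the helper names here are made up), the slug stays short and keyword-only, and the canonical tag points at the clean static URL:

```typescript
// Build a short, keyword-only slug: lowercase, hyphen-separated, no parameters.
function toSlug(name: string, city: string): string {
  return `${name} ${city}`
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse punctuation and spaces into hyphens
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}

// rel="canonical" pointing at the static URL, even when the page was
// reached through an Ajax state or a parameterized link.
function canonicalTag(baseUrl: string, category: string, slug: string): string {
  return `<link rel="canonical" href="${baseUrl}/${category}/${slug}" />`;
}

console.log(toSlug("Joe's Crab Shack", "Miami Beach"));
// joe-s-crab-shack-miami-beach
console.log(canonicalTag("https://example.com", "restaurants", "joe-s-crab-shack-miami-beach"));
// <link rel="canonical" href="https://example.com/restaurants/joe-s-crab-shack-miami-beach" />
```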

Where things get a little tricky is with inbound linking, email, social media links, and user agent detection. Whether someone is viewing the Ajax version of the content or the static version of the content, you should provide "link to this page," "share this page," or "email this page" functionality, and those links should always go to the static URL.
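
One simple way to guarantee that, sketched below in client-side TypeScript (the element IDs are placeholders), is to read the static URL from the canonical tag rather than from window.location, so shares never capture an Ajax state or hash fragment:

```typescript
// Read the static URL from the canonical tag instead of window.location,
// so "share" and "email" links never point at an Ajax state.
function staticUrl(): string {
  const link = document.querySelector<HTMLLinkElement>('link[rel="canonical"]');
  return link ? link.href : window.location.href.split("#")[0];
}

const url = encodeURIComponent(staticUrl());

// Hypothetical share/email widgets -- the IDs are placeholders.
const tweet = document.getElementById("share-twitter") as HTMLAnchorElement | null;
if (tweet) tweet.href = `https://twitter.com/intent/tweet?url=${url}`;

const email = document.getElementById("email-this") as HTMLAnchorElement | null;
if (email) email.href = `mailto:?subject=Worth a look&body=${url}`;
```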

When users request those pages or come from a search engine and ask for the static URL, you need to make a decision about how to serve that content. If the user agent is capable of working with Ajax/JavaScript, feel free to serve it that way. If it's a bot or a non-compatible user agent (i.e., a tablet, iPad, or mobile phone), then serve the HTML version. Lastly, I would always fail gracefully with a noscript fallback link that, when clicked, ensures users get the content they really want.
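
A rough sketch of that decision as an Express route is below; the bot pattern is illustrative and deliberately incomplete, and the /app.js bundle and fallback path are assumptions:

```typescript
import express from "express";

const app = express();

// Deliberately rough check -- a production list would be longer and maintained.
const BOT_PATTERN = /googlebot|bingbot|slurp|baiduspider|facebookexternalhit|twitterbot/i;

app.get("/restaurants/:slug", (req, res) => {
  const ua = req.get("user-agent") || "";

  if (BOT_PATTERN.test(ua)) {
    // Spiders (and any agent we treat as non-Ajax-capable) get plain HTML.
    res.send(renderStaticHtml(req.params.slug));
    return;
  }

  // Capable browsers get the Ajax shell, with a noscript fallback so
  // JavaScript-off visitors can still click through to the content.
  res.send(`<!doctype html>
<html>
  <body>
    <div id="app" data-slug="${req.params.slug}"></div>
    <script src="/app.js"></script>
    <noscript>
      <a href="/static/restaurants/${req.params.slug}">View this page without JavaScript</a>
    </noscript>
  </body>
</html>`);
});

// Placeholder -- in practice this renders from the same database as the Ajax view.
function renderStaticHtml(slug: string): string {
  return `<!doctype html><html><body><h1>${slug}</h1></body></html>`;
}

app.listen(3000);
```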

While this may seem like a bit of double work, if you use Ajax properly, it's probably not. You pull the same information from the same database; it's only the method of rendering that changes. Flash, on the other hand, will be a bit more problematic and would probably require a bit of double work, so it's not a method I recommend. One of the primary reasons it's a good idea to pull the data from the same DB is that it ensures you don't create a "bad cloaking" situation. Technically, cloaking is serving different content to the spiders than to the users. If the actual content is the same, and only the delivery technology and implementation differ, you have a low-risk, highly defensible position, especially if you use the canonical tag to nudge the spiders in the direction of the real URL.
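
Structurally, that low-risk setup can be as simple as the sketch below: one data-access function feeds both the JSON the Ajax view consumes and the HTML the spiders see, so the content is identical and only the delivery differs (the function, data, and field names are made up):

```typescript
interface Venue {
  name: string;
  description: string;
}

// Single source of truth -- both render paths call this.
// Placeholder for a real database query against the same table the Ajax view uses.
async function getVenue(slug: string): Promise<Venue> {
  return { name: "Joe's Crab Shack", description: "Seafood on Miami Beach." };
}

// What the Ajax view fetches and renders client-side.
async function renderJson(slug: string): Promise<string> {
  return JSON.stringify(await getVenue(slug));
}

// What spiders and non-JavaScript users receive, built from the same record.
async function renderHtml(slug: string): Promise<string> {
  const v = await getVenue(slug);
  return `<h1>${v.name}</h1><p>${v.description}</p>`;
}

renderJson("joes-crab-shack-miami-beach").then(console.log);
renderHtml("joes-crab-shack-miami-beach").then(console.log);
```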

Once you have the static URLs in place, you need to provide a method for the search engines to see and access that content. You can use HTML sitemaps and XML sitemaps, but ideally you should set up dedicated crawling paths. Unless your site is very small (fewer than a few hundred pages), I would suggest a limited test first: roll this out in phases on non-mission-critical sections first. Use text browsers, text viewers, and crawlers like Xenu Link Sleuth or Website Auditor. Lastly, I would suggest setting up a monitoring page for use with services like ChangeDetection and/or Google Alerts. It's important that you know if something "breaks" or "jumps the rails" within 24 hours, not 30 days later when 70% of your content has dropped out of the index.
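
The XML sitemap piece, at least, is mechanical: generate it from the same list of static URLs. A minimal sketch (the base URL and field names are assumptions):

```typescript
// Build an XML sitemap from the static venue URLs so spiders have an
// explicit crawl path to every page, independent of the Ajax navigation.
interface VenueUrl {
  category: string;
  slug: string;
}

function buildSitemap(baseUrl: string, venues: VenueUrl[]): string {
  const urls = venues
    .map((v) => `  <url><loc>${baseUrl}/${v.category}/${v.slug}</loc></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`;
}

console.log(
  buildSitemap("https://example.com", [
    { category: "restaurants", slug: "joes-crab-shack-miami-beach" },
    { category: "hotels", slug: "the-palms-miami-beach" },
  ])
);
```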

The last issue you want to consider is internal duplicate content. It's not entirely unlikely that the "Ajax crawling bot" will find its way to your pages, and you don't want it to index the content in that format. Using a rel="canonical" tag that points to the static, non-Ajax URL will help, but I'd also suggest the noindex, follow meta tag on the Ajax pages, just to be safe. Leaving things open for the search engines to decide is where problems come from ... sometimes BIG and EXPENSIVE problems ...
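
Concretely, the head of the Ajax-format page would carry something like the tags below, generated here by a hypothetical helper: the canonical points at the static URL, and the robots meta keeps this rendering out of the index while still letting links be followed:

```typescript
// Head tags for the Ajax version of a page: point the engines at the
// static URL and keep this rendering out of the index, links still followed.
function ajaxPageHeadTags(staticUrl: string): string {
  return [
    `<link rel="canonical" href="${staticUrl}" />`,
    `<meta name="robots" content="noindex, follow" />`,
  ].join("\n");
}

console.log(
  ajaxPageHeadTags("https://example.com/restaurants/joes-crab-shack-miami-beach")
);
```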

So what are the takeaways from this post?

  • Ajax isn’t evil, but the implementation is going to be more difficult and complex, so be smart about how you do it
  • Provide distinct, static, unique URLs that are accessible from the Ajax pages
  • Use user agent detection to serve the best version OF THE SAME content
  • Use spider simulators to ensure you are serving the right version
  • Use change detection and monitoring to detect problems with indexing quickly and correct them before your website falls off the map.

photo credit: Shutterstock/Serg Zastavkin

Related posts:

  1. Shopping Cart SEO Tips When you run an online ecommerce store with a shopping...
  2. Why Advertisers Love Flash and Ajax, and Why it’s Really Stupid Steve Rubel has a smashingly good bit of conversation bait...
  3. SEO Tips I Learned from Matt Cutts As an SEO one of the things we often have...
  4. Tips for Buying Old Sites Today’s post is a question from Max Capener, who wants...
  5. Tips for Controlling the Top 10 While this strategy is never going to work for anything...

This post originally came from Michael Gray, who is an SEO consultant. Be sure not to miss the Thesis WordPress Theme review.
