Tuesday, 23 November 2010

SEOmoz Daily SEO Blog


Answers to the SEO Professional's Litmus Test

Posted: 23 Nov 2010 06:34 AM PST

Posted by randfish

 Yesterday I wrote a post: Ten Question Litmus Test for Professional SEOs. Today, as promised, I'm providing my answers to these questions. Note that these aren't the only ways to answer correctly, as some of the questions are more open-ended. Within the answers, you'll also find citations to supporting sources.

  1. Which is more likely to have a positive impact on a page's search engine rankings and why - 10 links from 1 website or 1 link each from 10 different websites?
    _
    All other things being equal, there appears to be a very strong correlation between higher rankings and a diversity of linking domains. Hence, earning 10 links from 10 unique sites should provide greater benefit. My opinion is that Google rewards this type of linking because diversity indicates both broad popularity/importance and greater editorial citation vs. a single site (possibly one which has a relationship with the linked site). The larger quantity of linking domains is also a far greater barrier for marketers and businesses to earn vs. the single site's links.
    _
    (References: Correlation of Rankings, All Links are Not Created Equal)
    _
  2. Explain the difference between the following items and how the search engines treat them - 301 response code, 302 response code, canonical URL tag and meta refresh.
    _
    A 301 redirect tells browsers and search engines that a page has been permanently redirected to a new URL. A 302 redirect indicates a temporary redirection that will change again or revert back in the future. Search engines such as Google and Bing interpret a 301 redirect by passing the link equity and ranking metrics from the 301'd URL to the target page. 302 redirects do not always receive this treatment (though exceptions exist) and may show in the search results with the original URL/snippet even after the 302 redirect is in place.
    _
    The Canonical URL tag is a <link rel="canonical"> element in the header of a document that serves as a suggestion to search engines, indicating the "original" or "canonical" version of that page's content. It is intended to tell engines which URL is suitable for indexing when multiple pages contain the same or very similar content.
    _
    The meta refresh is a directive in the header of a document indicating that, after a certain amount of time has passed, the browser should redirect to a new location (or reload the page). Search engines appear to treat most short meta refreshes (a few seconds in length) as permanent redirects, passing the link equity and ranking metrics to the target page (they also claim to do this 100% of the time for meta refreshes marked with "0" seconds of delay). Pages with longer meta refreshes may simply be indexed as normal.
    _
    (References: SEO Advice: 302 Redirects, Canonical URL Tags, Google + Yahoo! w/ Meta Refreshes)
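    _
    To see the difference concretely, here's a minimal sketch (my own example, not from the original post) that fetches a hypothetical URL without following redirects and reports whether the server answered with a permanent (301) or temporary (302) redirect. It uses only the Python standard library.
    _
        # Minimal sketch: report whether a URL answers with a 301 or 302 redirect.
        # The example URL is a hypothetical placeholder.
        import http.client
        from urllib.parse import urlparse

        def describe_redirect(url):
            parts = urlparse(url)
            conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
            conn = conn_cls(parts.netloc)
            # http.client never follows redirects on its own, so we see the raw answer.
            conn.request("HEAD", parts.path or "/")
            resp = conn.getresponse()
            location = resp.getheader("Location")
            if resp.status == 301:
                print(f"{url}: permanent (301) redirect to {location}")
            elif resp.status == 302:
                print(f"{url}: temporary (302) redirect to {location}")
            else:
                print(f"{url}: status {resp.status}, no 301/302 redirect")
            conn.close()

        describe_redirect("http://example.com/old-page")  # hypothetical URL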
    _
  3. How can the meta robots tag impact how search engines crawl, index and display content on a web page?
    _
    The meta robots tag can be used to specify whether a page is included in a search engine's public index (for display in results), whether links on the page are followed vs. nofollowed and the display of the snippet (removing the description and/or excluding titles/descriptions from the Open Directory Project or Yahoo! Directory for example). The tag can also be used to prevent a cached version of the page being available via the search results.
    _
    (References: Meta Robots Tag 101)
    _
  4. Who are the top 2 search engines (as ranked by share of queries) in the following countries - the United States, United Kingdom, Russia and China?
    _
    In the United States, it's either Google & Bing or Google & YouTube (though the latter is technically owned by Google and thus could be considered a single entity; it's also true that YouTube is a specialized site exclusively for video content and thus may not "count" - this argument is often made in comparison to Twitter searches, many of which come through APIs). The UK matches the US on this front. In China, #1 is Baidu and Google is #2. In Russia, Yandex is #1 and Google is #2 (BTW, Yandex's English-language results are pretty darn good, and have that nice, early-Google minimalist feel to them).
    _
    (References: October 2010 US Search Engine Rankings, Hitwise UK Top Engines, iResearch data for China via Bloomberg, From Russia w/ Search Love)
    _
  5. Name at least 3 elements critical to ranking well in Google Local/Maps/Places search.
    _
    Any of the following would be reasonable answers - registration/verification with Google Local/Places; listing presence & consistency in Google Places sources; ratings and reviews from Google Places users; proximity to centroid and a match on local phone number/address; listing prominence in Google Places sources (e.g. Yelp, Citysearch, Urbanspoon, DexKnows, etc.); listings/references from sources that feature in "more about this place" (typically local coverage websites); business listing title/name/domain name.
    _
    (References: Local Ranking Factors, Ratings the New Reviews, Google Places Guidelines)
    _
  6. What aspects of social media marketing have a positive impact on search engine rankings (apart from the value of direct links from the social sites)?
    _
    Social media is a form of awareness marketing and branding, which can bring a wide variety of search engine ranking benefits. The most obvious and direct is the potential creation of links and references to the sites/pages that garner traffic and attention through social media. Another powerful influence is the use of social results directly in SERPs, as seen in Google's & Bing's integrations with Twitter (and Bing's integration with Facebook). Google also has connections, often via Gmail and other services, to "results from my social circle," which can bring results to page 1 that otherwise wouldn't appear. There are also indications that tweets, in particular, may be directly influencing rankings and being treated like links, particularly for queries that trigger the QDF (query deserves freshness) algorithm. Finally, brand associations and mentions may be mined by the engines from social sources and used in brand entities or co-citation algorithms to help a site/page be seen as more relevant or related to particular keywords and more "important" or "popular."
    _
    (References: Use Twitter to Rank in 5 Minutes, Google's Tweet Ranking Algorithm, Facebook + Bing's Plans to Make Search Social, Google Social Circle Goes Live)
    _
  7. List 5 tags/locations on a page where employing a target keyword can have a positive effect on search engine rankings.
    _
    Any of the following would be reasonable answers: Title element, domain name, subdomain, URL string, body element, alt attribute, bold/strong tag.
    _
    (References: Perfecting KW Targeting + On-Page Optimization, Correlation of Google/Bing Rankings, Explaining Google's Algorithm w/ Math)
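    _
    As a rough illustration (my own sketch, not from the post), the snippet below checks a toy HTML page for a target keyword in a few of the locations listed above. It uses simple regular-expression checks for brevity; a real audit tool would parse the HTML properly. The page, URL and keyword are all invented.
    _
        # Check a few on-page locations for a target keyword (toy example).
        import re

        page = """<html><head><title>Espresso machines compared</title></head>
        <body><h1>Espresso machines</h1>
        <img src="lever.jpg" alt="lever espresso machine">
        <p>Our favorite <strong>espresso</strong> picks.</p></body></html>"""

        url = "http://www.example.com/espresso-machines"  # hypothetical URL
        keyword = "espresso"

        checks = {
            "title element": re.search(r"<title>(.*?)</title>", page, re.I | re.S),
            "body element": re.search(r"<body.*?>(.*)</body>", page, re.I | re.S),
            "alt attribute": re.search(r'alt="([^"]*)"', page, re.I),
            "bold/strong tag": re.search(r"<(?:b|strong)>(.*?)</(?:b|strong)>", page, re.I | re.S),
        }

        for location, match in checks.items():
            found = bool(match) and keyword in match.group(1).lower()
            print(location + ":", "yes" if found else "no")

        print("URL string:", "yes" if keyword in url.lower() else "no")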
    _
  8. Describe the distribution of search query demand and what is meant by the "fat head" and "long tail."
    _
    Billions of searches are performed each week, but the vast majority of these (~70%) are query terms/phrases that are searched for fewer than 10 times per month. This distribution of demand is represented by a chart with a trailing line called the "long tail" (a term coined by Chris Anderson of Wired). The head of this curve, where the queries are searched thousands to millions of times each month (the very popular terms), is called the "fat head."
    _
    Long Tail Data Segmented
    _
    (References: Hitwise Blog, Illustrating the Long Tail)
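    _
    To make the shape of that curve concrete, here's a toy sketch (invented numbers, not real query data) that builds a Zipf-like demand curve and shows how a handful of terms dominate total volume while the vast majority of distinct terms are searched only a few times:
    _
        # Toy illustration of the "fat head / long tail" shape of query demand.
        # All numbers are made up; real figures come from sources like the
        # Hitwise data cited above.
        def zipf_demand(num_terms, top_volume=1_000_000):
            # The term ranked r gets roughly top_volume / r searches per month.
            return [top_volume / rank for rank in range(1, num_terms + 1)]

        demand = zipf_demand(1_000_000)
        total_volume = sum(demand)
        head_volume = sum(demand[:100])                # the 100 most popular terms
        rare_terms = sum(1 for v in demand if v < 10)  # terms searched < 10 times/month

        print("top 100 terms:", format(head_volume / total_volume, ".0%"), "of all search volume (the fat head)")
        print("terms searched fewer than 10 times/month:", format(rare_terms / len(demand), ".0%"), "of all distinct terms (the long tail)")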
    _
  9. Name 6 tools/sources that will display a list of external URLs that link to a webpage.
    _
    Any of the following sources would be acceptable: Google Webmaster Tools, Bing Webmaster Tools, Google Link Command (despite its shoddiness), Yahoo! Site Explorer, Yahoo! Link Commands (only available via non-Bing-powered versions of Yahoo!), SEOmoz Linkscape, Majestic SEO, Blekko, Exalead, Google Blog Search, Alexa, or a site's own web analytics/log files.
    _
    (NOTE: Although there are many other tools in the SEO realm that contain link data, all of them (to my knowledge) are powered by one or more of the above sources.)
    _
  10. What are some ways to positively influence the ratio of pages a search engine will crawl and index on a website?
    _
    There are a large number of potential ways to answer this question, so experienced SEOs will have to use their judgement about the answers given by others, but here are a few of the most obvious/sensible ones:
    _
    A) Reduce the quantity of low-quality, low-value and/or low-unique-content pages.
    B) Add and verify an XML Sitemap to send URL information to the engines (see the sketch below).
    C) Produce RSS feeds of pages/sections that frequently update with new content and use ping services to alert engines of changes/additions.
    D) Reduce the click-depth required to reach pages on the site.
    E) Eliminate confusing navigation and architecture such as heavy pagination, large numbers of faceted navigation options or multiple versions of categorization/organization hierarchies.
    F) Reduce or eliminate duplicate content (or leverage solutions such as rel=canonical tags).
    G) Earn more links (or possibly tweets) to pages that are being passed over for crawling/indexing.
    _
    (References: Diagrams for Solving Crawl Priority, An Illustrated Guide to Matt Cutts Comments on Crawling + Indexation, Testing How Crawl Priority Works, Google's Indexation Cap, Crawling + Indexing: Not as Simple as Just In or Out, Solving Indexation Problems)
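    _
    For item B above, here's a minimal sketch (my own example with placeholder URLs, not from the post) of generating a bare-bones XML Sitemap with the Python standard library so the engines get a clean list of URLs to crawl:
    _
        # Generate a minimal sitemap.xml for a handful of (hypothetical) URLs.
        import xml.etree.ElementTree as ET

        SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

        def build_sitemap(pages, outfile="sitemap.xml"):
            urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
            for page in pages:
                url_el = ET.SubElement(urlset, "url")
                ET.SubElement(url_el, "loc").text = page["loc"]
                ET.SubElement(url_el, "lastmod").text = page["lastmod"]
            ET.ElementTree(urlset).write(outfile, encoding="utf-8", xml_declaration=True)

        build_sitemap([
            {"loc": "http://www.example.com/", "lastmod": "2010-11-23"},
            {"loc": "http://www.example.com/blog/litmus-test", "lastmod": "2010-11-22"},
        ])
    _
    The resulting file would then be referenced from robots.txt or submitted via the engines' webmaster tools for verification.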
    _
  11. BONUS! Describe the concept of topic modeling and how modern search engines might use it to improve the quality of their results.
    _
    Topic modeling is a way for search engines to mathematically resolve the relationships between words and phrases and help determine whether a piece of content is relevant to a query. These systems typically leverage a vector space model in which the degree of "relatedness" is represented by an angle or "cosine similarity" - smaller angles indicate greater similarity while larger ones indicate less. Many tens or hundreds of thousands of dimensions may be necessary to accurately represent a corpus' collection of "topics" and the words/phrases that relate to them. Specific algorithms like LDA, LSI, LSA, pLSI, etc. are all implementations of topic modeling.
    _
    (References: Google Tech Talk: Topic Modeling, Topic + Keyword Re-Ranking for LDA-Based Topic Modeling, LDA + Google's Rankings)
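    _
    Here's a toy sketch of the vector space idea (my own illustration of plain term-vector cosine similarity, not LDA or any engine's actual topic model): two texts whose word vectors point in similar directions score well above zero, while unrelated ones score near zero. The query and documents are made up.
    _
        # Toy cosine similarity between bag-of-words vectors.
        import math
        from collections import Counter

        def cosine_similarity(text_a, text_b):
            vec_a, vec_b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
            shared = set(vec_a) & set(vec_b)
            dot = sum(vec_a[w] * vec_b[w] for w in shared)
            norm_a = math.sqrt(sum(c * c for c in vec_a.values()))
            norm_b = math.sqrt(sum(c * c for c in vec_b.values()))
            return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

        query = "best coffee roaster"
        doc_1 = "our coffee roaster reviews cover the best home roasting machines"
        doc_2 = "tips for training your dog to sit and stay"

        print(cosine_similarity(query, doc_1))  # noticeably above zero: shared topic words
        print(cosine_similarity(query, doc_2))  # zero: no shared vocabulary
    _
    Real topic models go further, mapping words that rarely co-occur directly into shared latent topics, which is where LDA-style approaches come in.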
    _

Eventually, we'll be replacing our old SEO Expert Quiz with something more like this. I also saw a number of requests for certification from SEOmoz - it's something we've talked about, but we've had lots of concerns (structure, logistics, policing, liability, etc.). I'd love to hear your thoughts on this - is it a service you'd like to see from us? Is MarketMotive or another vendor already doing a solid job here?



Segmenting Social Traffic in Google Analytics

Posted: 22 Nov 2010 04:37 PM PST

Posted by randfish

 If you use Google Analytics, you've undoubtedly seen a report like this:

Google Analytics Pie Chart

The problem is, there's no breakdown of "social media" in this view of traffic sources, and with the dramatic rise of social media marketing, marketers need an easy way to segment and "see" this traffic separately from the rest of their referrers. We know it's mixed in with "referring sites" and "direct traffic" but luckily, there's a way to extract that data in just a few simple steps.

Step 1: Create a Custom Segment

Custom segments are the way to go for separating traffic into filter-able buckets for deeper analysis. GA makes this fairly painless:

Step 1

From any of the "Traffic Sources" sections, just click "Advanced Segments" in the upper right-hand corner and then the link to "Create a new advanced segment."

Step 2: Add Social Sources

This is the most crucial part, and requires that you have a full list of the sites/words to include. I don't recommend using just the domain names or URLs of the most popular social sites, but instead, some clever "catch-all" words using the "source" condition, as shown below:

Step 2

Make sure to continue adding "OR" statements, not "AND" statements - the latter requires that every condition be met, whereas "OR" matches any one of them. Here's the list of words I used, though you can certainly feel free to add to it (a quick sketch of the same matching logic appears after the list):

  • twitter
  • tweet
  • facebook
  • linkedin
  • youtube
  • reddit
  • digg
  • delicious
  • stumbleupon
  • ycombinator
  • flickr
  • myspace
  • hootsuite
  • popurls
  • wikipedia

Depending on your niche, it might be valuable to run through your top 2-500 referring domains looking for any obvious matches. You could also refer to Wikipedia's list of popular social sites.
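
For anyone who wants to sanity-check the keyword list offline, here's a rough sketch (my own, with made-up sample referrers) of the same catch-all matching the segment performs - a visit counts as "social media" if its source contains any of the keywords:

    # Flag a referring source as social if it contains any catch-all keyword.
    SOCIAL_KEYWORDS = [
        "twitter", "tweet", "facebook", "linkedin", "youtube", "reddit", "digg",
        "delicious", "stumbleupon", "ycombinator", "flickr", "myspace",
        "hootsuite", "popurls", "wikipedia",
    ]

    def is_social(source):
        # Mirrors the OR conditions in the GA segment: any one match is enough.
        source = source.lower()
        return any(keyword in source for keyword in SOCIAL_KEYWORDS)

    for referrer in ["twitter.com", "m.facebook.com", "news.ycombinator.com", "google.com"]:
        print(referrer, "->", "social" if is_social(referrer) else "other")

Running it against an export of your top referring domains is a quick way to spot sources the keyword list misses.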

Step 3: Test & Name Your Segment

In order to create a fully functional segment, you'll want to test the logic you've created to be sure it returns results. Before you do that, though, GA requires you to name your segment (I used "social media"):

Step 3

Once it's complete and working properly, click "save segment." You'll be returned to the prior screen with the segment ready to rumble.

Step 4: Filter Traffic by "Social Media"

Your new segment is ready to be applied. You can now filter social media exclusively or see it in comparison to other traffic sources on any report in GA. Just use the advanced segments drop-down and choose "social media" under the custom segments list like so:

Of course, just having data is useless unless there's some action you can take from it. Segmenting social traffic is useful for reporting, particularly to gauge value (if you have action tracking on commercial activities set up in GA, for example) and see growth/impact over time. But, there's more you can learn than just raw traffic and conversion numbers.

Here are some examples of reports I ran, along with the value/intelligence extracted from the data:

It can be tough to "see" the social sites among other referring domains, but once they're broken out, combing through and finding the sites where your efforts are working is vastly simpler. If you then compare this against traffic "opportunity" from these sites (using a combination of traffic data and gut check), you'll be able to find which sites have the greatest chance to improve. For SEOmoz, Facebook, LinkedIn, Reddit and Wikipedia stand out to me as places where we likely have more opportunity than we're currently capturing.

This next chart compares search vs. social traffic over time:

If I'm looking to evaluate progress and make comparisons, this view is fairly useful. I can tell if my share of social media is growing or shrinking and how it compares to overall traffic and search individually. I'm only looking at a short timeframe here, but over the course of weeks or months, I can quickly gauge whether my efforts in social are paying off with traffic and whether they're improving my performance in search engines (through new links, citations, etc). When someone asks if social helps search, showing these two segments over time can be persuasive.

Next, I'm reviewing the level of engagement of social media visitors:

At first, I can compare this against other segments (like "search" or "direct") as a measure of comparative value. But, I also want to compare this over time, particularly if I'm making tweaks to my site to encourage greater engagement and click-through to see if those efforts are successful.

Just because I'm curious, I'll check out some browser stats:

 

Admittedly, this isn't especially actionable, but it is fascinating to see the browser "savvy" of social users: the traffic is dominated by Firefox and Chrome, with very little Internet Explorer use. If I'm trying to see what cutting-edge users are shifting towards, this is where to look. I suspect RockMelt will soon be joining the list. (BTW - I love that 5 people came with the user-agent "Googlebot" - awesome).

Last, let's peek at the pages social visitors see:

These are all potential opportunities to create more customized landing experiences based on the referrer path, and the report can also give me insight about what content I need to continue producing if I want to draw in more social traffic. 


If social media marketing is a focus of your organization, segmenting that traffic in reporting is critical to determining the value of your efforts and improving. So get into GA, segment, and start seeing your traffic for what it really is. 

p.s. Himanshu wrote an excellent post on a very similar topic, showing even more ways to slice and dice the data (and make some cool charts in Excel) on YOUmoz.



Seth's Blog : The full day publishing seminar


The full day publishing seminar

Book publishing is in the throes of serious change, from format to content to marketing. Since my first book in 1986, I've been thinking about this--as a writer, a self-publisher, an ebook creator and as a marketer. I've probably had my hands on 200 books or booklike projects over the last twenty-five years, and I've learned a lot.

For the first time, I'm running a seminar to talk about it. This is a day, at the fabulous Helen Mills Theater in New York City, to understand how effective book publishing works starting now. I'll talk about what's worked and what hasn't, describe my vision for how an asset can be built going forward, and most of all, interact with you about your projects and opportunities.

Because everyone in the room has a similar agenda, we'll be able to focus really closely on how the new marketing and the changes in our world are going to impact our industry.

The day is created with writers, editors, agents and publishers in mind. I believe now more than ever that a book has a significant impact, that it can change minds and that it can be part of a useful business model as well.

If you'd like to come, please sign up as soon as you can, because there are fewer than 100 seats. Use discount code "pilgrim" to save 25% if you get in before this Thursday.


Tuesday Talk: Health Care Reform with Nancy-Ann DeParle

The White House: Your Daily Snapshot for Tuesday, Nov. 23, 2010


Nancy-Ann DeParle, Director of the White House Office of Health Reform, will be answering your questions on health reform implementation in a live video chat today at 2:15 p.m. EST.

Submit your questions and watch live.

Photo of the Day

First Lady Michelle Obama joins students for a Let's Move! Salad Bars to Schools launch event at Riverside Elementary School in Miami, Fla., Nov. 22, 2010. (Official White House Photo by Chuck Kennedy)

In Case You Missed It

Here are some of the top stories from the White House blog.

What They’re Saying: New Health Care Rules Protect Consumers
Read what leaders are saying about the new medical loss ratio rules that give consumers more value for their health care premium dollar.

White House White Board: Your Health Care Dollars
Nancy-Ann DeParle, Director of the Office of Health Reform at the White House, explains a new provision of the Affordable Care Act in the latest edition of the White House White Board video series.

The Return of First Question
Press Secretary Robert Gibbs takes questions from Twitter on Don't Ask Don't Tell and the New START Treaty as part of our ongoing "First Question" series.

Today's Schedule

Today the President and the Vice President will travel to Kokomo, Indiana as part of the White House to Main Street tour. In the afternoon, they will visit the Chrysler Indiana Transmission Plant II where they will take a tour, greet plant workers, and deliver brief remarks.

All times are Eastern Standard Time.

9:15 AM: The President receives the Presidential Daily Briefing

9:55 AM: The President departs the White House en route Andrews Air Force Base

10:10 AM: The President departs Andrews Air Force Base en route Peru, Indiana

11:45 AM: The President arrives in Peru, Indiana

1:20 PM: The President and the Vice President tour the Chrysler Indiana Transmission Plant II in Kokomo, Indiana

1:35 PM: The President and the Vice President deliver remarks to workers WhiteHouse.gov/live

2:15 PM: Tuesday Talks: Nancy-Ann DeParle WhiteHouse.gov/live

3:45 PM: The President departs Peru, Indiana en route Andrews Air Force Base

5:10 PM: The President arrives at Andrews Air Force Base

5:25 PM: The President arrives at the White House

WhiteHouse.gov/live indicates events that will be livestreamed on WhiteHouse.gov/live.


SEOptimise


Track Google Site Preview Bot in Google Analytics

Posted: 22 Nov 2010 05:11 AM PST

Rasmus Himmelstrup (@rasmusgi) tweeted a link to a post on Web Analytics Island about how to track the Google Site Preview bot in SiteCatalyst. The tracking relies on the fact that Google Site Preview shows a rendered image of the page, so it must execute any JavaScript on the page. If Google is executing the SiteCatalyst code, then maybe it is also running the Google Analytics code. I want to track this thing. Let's get to work!

Kevin at Web Analytics Island uses the user agent string

Mozilla/5.0 (en-us) ApplewebKit/525.13 (KHTML, like Gecko; google web preview) Version/3.1 Safari/525.13

to identify the Google bot, but this information (as a complete string) is not available in Google Analytics. However, I could check whether visits using the Google bot's version of Safari increased after the introduction of Site Preview:

This is kind of what I wanted to see.

I made an advanced segment to contain this data:

Looking at the “Service Providers” report shows that this segment is definitely interesting, even if it is not tracking the Google Site Preview Bot:
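
Since GA only exposes fragments of the user agent, another rough cross-check (my own sketch, not part of the original experiment) is to count the preview bot directly in the web server's access logs by looking for its user agent string:

    # Count access-log lines whose user agent mentions Google Web Preview.
    import gzip

    PREVIEW_MARKER = "google web preview"  # appears in the preview bot's user agent

    def count_preview_hits(log_path):
        opener = gzip.open if log_path.endswith(".gz") else open
        hits = 0
        with opener(log_path, "rt", errors="replace") as log:
            for line in log:
                if PREVIEW_MARKER in line.lower():
                    hits += 1
        return hits

    print(count_preview_hits("access.log"))  # hypothetical log file path

Comparing that count against the advanced segment's visit numbers would show how much of the segment really is the preview bot versus ordinary Safari traffic.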


Related posts:

  1. 6 Personas that Google Analytics can’t Track
  2. Finding Errors in Your Google Analytics Tracking Code
  3. 30+ More Google Analytics Tools, Apps, Hacks, Tweaks and Other Resources

Seth's Blog : Reasons to work


Reasons to work

  1. For the money
  2. To be challenged
  3. For the pleasure/calling of doing the work
  4. For the impact it makes on the world
  5. For the reputation you build in the community
  6. To solve interesting problems
  7. To be part of a group and to experience the mission
  8. To be appreciated

Why do we always focus on the first? Why do we advertise jobs or promotions as being generic on items 2 through 8 and differentiated only by #1?

In fact, unless you're a drug kingpin or a Wall Street trader, my guess is that the other factors are at work every time you think about your work. (PS Happy Birthday Corey.)
