Thursday, 23 August 2012

Hiring 125,000 Veterans and Military Spouses

The White House

Your Daily Snapshot for Thursday, August 23, 2012

 

Hiring 125,000 Veterans and Military Spouses

Yesterday First Lady Michelle Obama spoke at Naval Station Mayport near Jacksonville, Florida, for a big announcement: in the last year, 2,000 companies have hired or trained an amazing 125,000 veterans and military spouses through Joining Forces.

Check out the infographic and learn more about what we're doing for our military families.

In Case You Missed It

Here are some of the top stories from the White House blog:

Today: Office Hours with CTO Todd Park
This morning, the White House launched the Presidential Innovation Fellows program. Today at 5 p.m. EDT, CTO Todd Park will host a special session of White House Office Hours on Twitter. Find out how to engage.

Building-Blocks of a 21st Century Digital Government
Federal agencies are making great strides towards putting a solid foundation for a 21st Century Digital Government in place.

Recreation.gov Redesign: Engaging Visitors with the Great Outdoors
With a few weeks of summer weather left, there is still plenty of time to explore the great outdoors. There are few better places to do that than in America’s national parks, wildlife refuges, waterways, and forests. A new and improved Recreation.gov website is the perfect tool to plan your family’s next adventure.

Today's Schedule

All times are Eastern Daylight Time (EDT).

12:00 PM: Briefing by Press Secretary Jay Carney WhiteHouse.gov/live

12:05 PM: The President and the Vice President receive the Presidential Daily Briefing

WhiteHouse.gov/live indicates that the event will be live-streamed on WhiteHouse.gov/live.

Get Updates

Sign up for the Daily Snapshot

Stay Connected


Please do not reply to this email. Contact the White House

The White House • 1600 Pennsylvania Ave NW • Washington, DC 20500 • 202-456-1111

 

Do Improved Social Signals Cause Improved Rankings?


Posted: 22 Aug 2012 07:57 PM PDT

Posted by willcritchlow

Everyone in search is by now aware that certain social signals are well-correlated with rankings.

In each major study published on the subject, the authors point out that correlation does not imply causation (see, for example, SEOmoz and Searchmetrics). Dr. Pete even wrote a whole post on the subject.

I wanted to see if it was actually plausible for these correlations to arise without social signals being a direct ranking factor. I built some Excel models to test this out and see if I could build a model that achieved the observed correlations without assuming social signals as a ranking factor.


The punchline: it's possible there is no causation

I have a suspicion that this could be the most misinterpreted post I have ever written, so I thought I'd start with a prominent "Cliff notes" section to be explicitly clear about what I am saying and, more importantly, what I am not saying.

I am saying

You can tweet any of the following without misrepresenting me:

  • Social signals *may* be correlated with better rankings but not cause them [tweet this]
  • Facebook Likes and rankings could achieve high correlation without Likes being a ranking factor [tweet this]

I am not saying

If you tweet any of the following attributed to me, I will write "does not follow instructions" on your forehead in magic marker:

What is this based on?

I have built a simplified Excel model of how pages accrue Likes over time. With no assumption of them being a ranking factor, I nevertheless demonstrate that we could see a strong correlation between Likes and ranking position.

Why focus on Likes?

The modelling works equally well with any of the social signals. I simply chose Likes to make the example more concrete - you could build the exact same correlation model with Tweets, Facebook Shares, Google +1s, or any other signal where accruing more social shares makes it even more likely that you will accrue more in the future.


Starting at the beginning

Every time we see a correlation study, I see evidence that some people haven't completely taken on board the correlation/causation subtleties. This is unsurprising - the mathematics behind the calculations in these studies is typically undergraduate level (with some of the advanced analysis verging on graduate level) - and most people's intuition lets them down horribly when confronted with probability and statistics. (Don't believe me? Check out the Monty Hall Problem.)

So let's start from the beginning:

What are these studies looking for?

When we say correlation in this context, you can imagine that what we are looking for is similarity. We are looking for evidence that two things happen together (and don't happen together).

In the context of these studies, we are typically looking to see if "ranking well" happens together with "strong social signals."

Now - the mathematical part comes in when we try to define "happens together with" properly. The human brain is a remarkably powerful pattern matching device. For example - how many sportsmen and women have a pre-game routine involving a specific pair of lucky socks because of a sequence of events something like:

  • Wore a new pair of socks today. Kicked ass.
  • Wore the same pair of socks as last week. Kicked ass.
  • New socks in the wash. Grabbed a different pair. Got whupped.
  • Socks successfully cleaned and dried. Kicked ass again.

Pretty compelling evidence for those socks, huh?

From that point onwards, the athlete refuses to surrender the lucky socks. Any future losses are attributed to other factors ("I did everything I could - I even wore my lucky socks").

Superstitious athletes

Michael Jordan apparently started wearing longer shorts to cover his UNC "lucky shorts"

But let's look at this a little more closely and skeptically. Are there any other explanations for this sequence of events? Imagine that the athlete in question is good - winning roughly 75% of his or her games on average. Imagine also that the socks are, in fact, not magic and that they have no impact on the result (shocking, I know). The probability that the single loss in a set of 4 games happens to fall on the one game played in a different pair of socks is then: 0.75 x 0.75 x 0.25 x 0.75 ≈ 0.11

In other words, roughly one in ten pairs of socks would randomly look this lucky.
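The arithmetic is quick to sanity-check; the 0.75 win rate is the assumption from the paragraph above:

```python
p_win = 0.75  # assumed win rate for a good athlete

# Probability of the exact sequence: win, win, loss (the one game in
# different socks), win - with the socks having no effect at all.
p_sequence = p_win * p_win * (1 - p_win) * p_win
print(round(p_sequence, 2))  # 0.11 - roughly one pair of socks in ten
```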

Given all this evidence, most of us would probably chalk it up to chance (but keep wearing our lucky socks just in case).

Add to this the fact that we can't help but be always on the lookout for these patterns (it's just how our brains are wired) and it's unsurprising that there is always some pattern to be seen somewhere.

Given all of this, we apply pretty high standards of proof before stating that there is correlation [i.e. that two things tend to happen (or not) together]. This is measured with a "confidence" which is similar to the layman's definition but is measured in probabilities. We express our confidence in terms of "the probability that we would see a correlation at least this strong even if there were no underlying correlation." Statisticians typically talk in 95% or 99% confidence ranges (though note that a 95% confidence interval is still wrong one time in 20).
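To see what "wrong one time in 20" looks like in practice, here is a small simulation sketch (not part of any of the original studies; the sample size and trial counts are illustrative numbers I picked): it correlates pairs of completely unrelated random series and counts how often a permutation test nevertheless reports 95% confidence.

```python
import numpy as np

rng = np.random.default_rng(0)

def perm_p_value(x, y, n_perm=200):
    """Two-sided permutation p-value for the Pearson correlation of x and y."""
    observed = abs(np.corrcoef(x, y)[0, 1])
    hits = sum(
        abs(np.corrcoef(x, rng.permutation(y))[0, 1]) >= observed
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)

# 1,000 pairs of series with no underlying relationship whatsoever
false_positives = sum(
    perm_p_value(rng.normal(size=20), rng.normal(size=20)) < 0.05
    for _ in range(1000)
)
print(false_positives)  # roughly 5% of 1,000 pass the 95% confidence bar anyway
```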

The ranking factor studies undertaken by SEOmoz and others have shown a non-zero correlation with high confidence. In other words, there is a correlation between certain social signals and higher rankings. I don't think anyone is seriously disputing that at this point.

Correlation is not causation

This tricky phrase gets wheeled out with every study. What does it mean?

It means that the mathematical techniques we have applied to be confident that there is a relationship between these two variables say nothing about whether one causes the other.

It's easy to think of correlations that are not causative. More ice creams are sold in months when more sun lotion is sold. Sun lotion sales don't cause ice cream sales and ice cream sales don't cause sun lotion sales. Both are caused by sunny days.

While measuring correlation is straightforward based solely on raw data, this is generally not sufficient to judge causation. This is especially true where neither variable is in your control (such as the sunshine example above). Measuring or understanding causation is a topic for another day.

The important thing to note is that the size of the correlation or the degree of confidence in the correlation have no bearing at all on the likelihood of a causal relationship.

This is one of the common misconceptions with the interaction between social signals and rankings - when people say things like:

"But the correlation is too high - social signals must be a ranking factor"

I'm afraid my response is

"I'm sorry to inform you that you have been taken in by unsupportable mathematics designed to prey on the gullible and the lonely."

Sheldon moment

Sorry for the Sheldon moment there

Seeking alternative explanations

I believe in a healthy skepticism when presented with bold evidence. I can see lots of arguments why search engines could view social signals as ranking factors (though at least in the case of tweets, I've long supported an algorithmic discounting of nofollow). For all the reasons outlined above, however, I'm not convinced we have seen real evidence that this is in fact what is happening.

Assuming we take the correlation studies at face value, there are three possible explanations:

1. Social signals are a ranking factor (and apparently a strong one at that)

This appears to be the hypothesis of Searchmetrics:

These findings come from a study by search and social analytics company Searchmetrics aimed at identifying the key factors that help web pages rank well in Google searches

From Searchmetrics (emphasis mine).

Social signals ranking factors

2. The causation goes the other way - ranking well results in better social signals

Although it's hard to know how strong this effect could be, it's easy to believe there is some kind of effect here. Just think about your searching/liking behaviour:

Carry out a search:

Search for [excel for seo]

Click on a link:

Mike Pantoliano's Excel for SEOs

Recommend the page:

Like the content

I only used this example because I know how disappointed Mike was when we had to move this page - while the redirect carried across much of the link equity, it reset the social signals - this content has been tweeted and Liked thousands of times. Sorry, Mike.

3. There is a hidden causal variable (some kind of "page quality" signal?) that causes better rankings and increased social signals

The research Dan Zarrella published here last week on the relationship between social signals and links indicates that this is a plausible explanation - since we see that there is a fairly strong relationship between the two. The challenge with this approach is that if we believe social signals' correlation with rankings comes entirely from their correlation with a real ranking factor, it's surprising that we often see a stronger correlation between rankings and social signals than with any other single factor:

Relationship between FB shares and links

Can the alternatives account for the observations?

Whenever this has come up in conversation, I've had people express doubts that #2 or #3 could be strong enough effects to give the results we see.

My intuition said that #2 could be, mainly based on the fact that any effect that is there will compound over time under the assumption that "Likes beget Likes", which seems reasonable given the way Facebook EdgeRank and visibility work. If compounding growth magnifies small effects, then over time we could see remarkable correlations appear from relatively small causes.
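As a toy sketch of that compounding argument (the 3% monthly "Likes beget Likes" rate is the figure the model below uses; the per-month inflows of 4 and 1 are made-up numbers): two pages that differ only by a small traffic-driven trickle of Likes end up much further apart than the raw monthly difference suggests.

```python
def likes_after(monthly_inflow, months=36, compound_rate=0.03):
    """Total Likes accrued when existing Likes attract 3% more each month."""
    total = 0.0
    for _ in range(months):
        total = total * (1 + compound_rate) + monthly_inflow
    return total

# The pages differ by only 3 Likes/month, yet after three years the
# absolute gap (~190) is far larger than the raw 36 * 3 = 108.
print(round(likes_after(4)))  # 253
print(round(likes_after(1)))  # 63
```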

So I decided to see if I could build a plausible model of #2.

Imperfect models

What do I mean by a plausible model?

I mean that I'm going to simplify a whole raft of stuff from the real world (I'm going to think about a single SERP, for example, and I'm going to think only in time units of months). I'm going to attempt not to have these oversimplifications bias the answer in my favour. My default position (known as the "null hypothesis" in statistics and probability) is that these effects are not strong enough. I'm going to construct a model that biases towards that being true and see if I can still produce a strong enough effect.

Hacking the Excel

I built this model in Excel [warning: macros]. It's very hacky - just designed to find an answer rather than to be a robust model. It takes a set of simple assumptions (none of which include a causal link from Likes to rankings) - you can see these on the "Input" sheet - and you can substitute your own values if you would like to see the impact these have:

Model parameters

  • Top ranking pages get 400 visits / month from search (the model over-simplifies to think about a single keyword/SERP getting 1,000 searches a month - this is a proxy for all organic traffic to the page)
  • Organic traffic drops off through the ranking positions according to an averaged traffic distribution
  • Each website is labelled as doing "Facebook marketing" (whatever that entails exactly) with a 30% probability. Facebook marketing doubles the rate of "random" Like acquisition. <geeky details>Sites not doing FB marketing in the model accrue "random" Likes according to a Poisson distribution with a mean of 10</geeky details>
  • Likes --> more Likes at a rate of 3% (i.e. for every 100 Likes a page has, it'll get 3 more in the next month)
  • Traffic --> Likes at a rate of 1%

This is what the Poisson distributions look like for the geeks in the audience:

Poisson distribution

It creates a really simple time series of Likes for each page in each month. The model runs for 36 time periods. Each refresh of Excel runs a new scenario and results in a single Spearman rank correlation at the end of the time period. Spearman rank correlation is the same measurement tool used in the SEOmoz and Searchmetrics studies.
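For anyone who would rather not run the Excel macros, here is a rough Python re-implementation of the same assumptions. The CTR curve is my own stand-in for the averaged traffic distribution, so don't expect it to reproduce the spreadsheet's exact numbers; the point is only that Likes never feed back into rankings anywhere in the code, yet the correlation comes out positive.

```python
import numpy as np

rng = np.random.default_rng(42)

N_PAGES, N_MONTHS, SEARCHES = 10, 36, 1000
# Assumed averaged click-through by ranking position (position 1 first);
# position 1 gets 40% of 1,000 searches = 400 visits/month, as in the model.
CTR = np.array([0.40, 0.16, 0.10, 0.07, 0.05, 0.04, 0.03, 0.025, 0.02, 0.015])

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rank = lambda v: np.argsort(np.argsort(v))
    return np.corrcoef(rank(x), rank(y))[0, 1]

def run_once():
    likes = np.zeros(N_PAGES)
    fb_marketing = rng.random(N_PAGES) < 0.30     # 30% of sites do FB marketing,
    lam = np.where(fb_marketing, 20.0, 10.0)      # doubling "random" Like accrual
    for _ in range(N_MONTHS):
        likes = (likes
                 + rng.poisson(lam)               # "random" Likes, Poisson mean 10/20
                 + 0.03 * likes                   # Likes --> more Likes at 3%
                 + 0.01 * SEARCHES * CTR)         # traffic --> Likes at 1%
    # Rankings are fixed throughout; Likes never influence them above.
    return spearman(-np.arange(N_PAGES), likes)   # better rank vs. more Likes

mean_corr = np.mean([run_once() for _ in range(100)])
print(round(float(mean_corr), 2))
```

Whatever positive number comes out of this loop arises purely from traffic and compounding, which is exactly the point of the exercise.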

This is what the growth of Likes looks like for a single example run (note that the lines are not ordered by ranking position despite the fact that the ordering is correlated with ranking across many runs):

Growth rate by ranking position

I then ran the same model a hundred independent times to get a fair assessment of the correlation we could expect as a result of the simple assumptions above. (There is an embedded macro that does this for you if you would like to reproduce it - I'm not a macro expert - there's no doubt loads wrong with this):

Markov macro

I got a correlation of 0.44

This is actually higher than the correlation found in most studies I'm aware of, and it was based on my first pass of "finger in the air" assumptions. It's easy to tweak the assumptions to get far higher correlations. I'd be interested in a discussion about realistic assumptions and/or flaws in the methodology.

Since there is definitively no causation in my model, unless someone can find a flaw in my method (a very real possibility - I'm a little rusty at this), I'm going to declare that it absolutely is possible for the factors we described above to be strong enough to result in the measured correlation without Likes causing better rankings directly. (Remember - you could build this exact same model applied to any of the social signals so this applies equally well to Tweets, Facebook Shares, Google +1s, etc.)

I'd love to hear if you think I've missed something or got something wrong in building my model.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Lost Your Google Reviews? Take A Proactive Stance!

Posted: 22 Aug 2012 03:14 AM PDT

Posted by MiriamEllis

Apart from a local business profile completely disappearing from Google's local index, few things cause more frustration and heartache than lost reviews. Unfortunately, lost reviews are one of the problems most frequently reported by local business owners, creating stormy weather in this whole region of Google land. Have you or your clients recently experienced an uptick in lost reviews? This post is for you!

Lost reviews are almost as old as Google's entrance into Local. As early as 2006, Local SEO extraordinaire, Mike Blumenthal, began writing about the important role reviews appeared to be playing in local rankings, and in 2007 about a sudden loss of reviews and suspicion of a review filter. This saga has continued month by month over the past 6 years, with countless reviews disappearing into a black hole and bewildered business owners begging for answers.

I now present to you: the #1 link you need to have if you've lost your reviews!

This will take you to a thread in the Google and Your Business Forum in which Mike Blumenthal is attempting to have all lost review complaints consolidated into one place. It is suspected that Google has either implemented a new filter or done an upgrade to an old one, causing many reviews (including totally legitimate ones) to be lost. Mike states:

"Let's consolidate the issues into this ONE huge post and see if we can get someone from Google to monitor all of these cases... If you are in, I will do what I can to get more Google eyes looking at this issue."

As a Top Contributor to the Google and Your Business Forum, Mike has the ability to communicate directly with Google staff, so this is the best place on the web to document your lost reviews in hopes of taking effective action. Fair warning: don't hold your breath on this. You may or may not see your reviews reappear, but at the very least you will be making yourself heard and signaling to Google the seriousness of the current issue, hopefully generating better future outcomes if not a return of specific reviews you have just lost. I'm encouraging you to take a proactive stance on an issue that has a genuine effect on millions of businesses. There are already more than 99 posts on that thread. Keep them coming!

Between August 6 and 16, Google staffer Jade W. provided several responses on the thread, which I recommend you read in full. Consolidated, the responses include:

"Soliciting reviews is suspect behavior for our systems, so please please please make sure your reviews are legitimate and left by your customers of their own accord...The majority of the reviews cases that I have investigated from the forum and other channels are reviews being taken down for suspicious reviewing behavior...

It's fine if you reach out to customers to ask them to review, but I do not recommend that you do this in waves. If you want to reach out to legit customers and ask them to review, I recommend you contact them immediately after you have done business with them. ...In our ideas, the "ideal" review is by a customer who writes a review of a place completely by his or her own accord, on mobile during the experience or at home after. This would mimic the regular flow of the business. On the other hand, some SEO companies that resort to spam reviews to deliver "results" would exhibit different behavior."

As I understand it, Jade W. is indicating that the majority of reviews are being removed for 'suspicious behavior', and she mentions types of review solicitation and also 'waves' of reviews. Local SEOs and business owners who have been following the review issue for years will almost certainly recall that Google has not only solicited reviews in the past, but also authorized the use of review stations in December 2011.

It would appear that if you followed Google's lead on this and ran a contest to solicit reviews (thereby generating a wave of incoming reviews) or set up a review station in your shop to solicit reviews, you could be in danger of losing those reviews. Not trying to be a smart aleck here, but I honestly don't believe I will be alone in seeing a bit of irony in this scenario.

If Google has now decided that legitimate reviews don't come in waves, I hope they will read the comment in the same forum thread from Top Contributor Linda Buquet of Catalyst eMarketing. Here is an excerpt:

"Here's a common example that I think often happens and is totally legit. This could be a local store, restaurant, Dentist, or whatever...Monthly email newsletter goes out. At bottom it says "Check out all our great reviews on Google and please leave us one if you have any feedback to share"...Then due to that newsletter going out to all customers, they may get a bunch of reviews all at once. Then next month another big rush."

I have to agree with Linda on this. There are so many instances that could generate waves of interest and, thus, waves of reviews. How about a blowout sale at a store, a special foodie event at a restaurant, or a high-profile news piece on a local business? In the 17th century, Isaac Newton and Christiaan Huygens had conflicting theories as to whether light is a wave or a particle. Eventually, the theory of wave-particle duality was postulated to support the idea that all particles have a wave-like nature and vice versa. I find this applicable to the review scenario. Legitimate reviews can come along one-by-one, but it's easy to think of lots of instances in which they could legitimately come in waves.

Honestly, what Newton and Huygens thought, and what you and I think, isn't really the issue here. The issue depends on what Google is thinking and how their thoughts are going to directly affect your business' ability to hang onto the reviews you get. Right now, Google is indicating that they have become suspicious of waves. Good to know.

So, if you've taken step one of reporting your lost reviews on the Google thread linked to above, here's an interesting second step to take. I recommend reading Joy Hawkins' article on avoiding the review filter. Joy is careful to note that her test is small and that she is only sharing a theory, but that the usage of a mobile device when leaving a review apparently enabled Joy to get two formerly 'lost' reviews to appear. It would be great if some of the Local SEOs reading this article could take this ball and run a little further with it, creating a larger test. Can your clients solicit reviews that 'stick' by utilizing a mobile device for the transaction? I'd love to know!

For my third resource, I will again cite a Mike Blumenthal post (which he actually published and then re-published because of the recent upsurge in lost reviews): What Should You Tell A Client When Google Loses Their Reviews - A 4 Part Plan. Two things are especially noteworthy in this post: 1) a warning to the auto industry that it is under keen scrutiny right now for heavy spamming, and 2) encouragement to copy any reviews that you do receive. In case they disappear in future, you'll have a copy and can post the 'lost' review to your website as a testimonial, salvaging at least some of the power you have to share your good name with your community.

My personal position in all this is one of empathy. I know you work hard for your reviews. More importantly, I know how hard you work to run a business that offers such excellent service that you generate good feelings amongst your customers. It's a genuine loss when documentation of your satisfaction rate vanishes through no fault of your own.

I understand that it can be hard for Google (or anyone) to tell whether a review is legitimately earned, or if money has changed hands in exchange for a positive false review, but I encourage Google to treat this issue with the seriousness it deserves. Every day, countless business owners are spending their valuable time trying to educate themselves about the Google system that has so much power over the fate of their businesses. They are taking this very seriously, and most are trying to play by the rules. This effort should be rewarded with transparency and fair treatment, and it's totally up to Google to provide this, since they are calling the shots.

If you've recently lost reviews, post the details in the Google and Your Business Forum thread linked to in this post.

In Sum

  • Be aware that Google appears to have become suspicious of waves of reviews. Getting reviews slowly may be good insurance against loss.
  • It's always been a best practice to diversify in your review gathering efforts. If you let your customers pick their favorite review platform rather than guiding them towards Google, you will end up with a more diverse profile. If this means reviews come to you through Google more slowly, it appears that this may actually be a good thing.
  • You might like to try soliciting reviews on mobile devices, per Joy Hawkins' experiment, to see if they 'stick.'
  • Know that you're not alone. Lots of businesses are experiencing lost reviews now, and if you're in the auto dealership industry, there appears to be heavy scrutiny going on.
  • Create a document on your computer to save any review you see come in. If it disappears tomorrow, you can still get some leverage out of it by using the saved copy as a testimonial on your website.

I hope this article has empowered you to proactively participate in this issue and has brought you up-to-date on the latest news on this front. Leave your comments or other advice below!



brightonSEO Sponsorship & Ticket Giveaway


Posted: 22 Aug 2012 04:59 AM PDT

Image of Brighton SEO from http://www.brightonseo.com/

SEOptimise is really pleased to announce that we're sponsoring brightonSEO this year. Since we started in 2007 we have really enjoyed being part of the SEO community, and this sponsorship opportunity gives us a chance to give something small back in helping Kelvin keep the conference free.

As one of the main events in the UK SEO calendar, and now running twice a year, brightonSEO really is a must-attend event, which always delivers on both content and fun. The tickets for this year's event went in an hour, and considering that the number of tickets has nearly doubled, that's even more impressive.

From this year's event I wanted to highlight some of the things that I'm looking forward to (Kelvin did prompt me a little, but it's been good looking through what's happening):

Firstly the training workshops

Although I can't make it, the Microformats, Schema & Rich Snippets training from Richard Baxter is my highlight. With this area only set to become more important, this session should give you the information you need to start implementing semantic markup, or to improve your existing implementation. Kelvin has also offered a 15% discount for my session selection, so just use the code seoptimise15.

There is also the Advanced Google Analytics training by Dara Fitzgerald. With the plethora of online data and the ever-increasing acquisition channels at the disposal of today's marketers, it's becoming increasingly important for channel practitioners to track, measure, and analyse the real value of their activities. I'm sure Dara's session will deliver some great content for those looking to get the most out of their Google Analytics setup.

Finally, I have to mention Kevin's content marketing workshop. Some of you might know that Kevin used to work with us and I'm sure he will deliver some interesting insights and some very practical methodologies into content marketing and how you could develop a comprehensive content marketing strategy.

The Conference

For the conference some highlights that I'm looking forward to include Richard Baxter (again!) with 'How to be a better SEO'. We work very hard at SEOptimise on personal development for all staff as, in an industry that is constantly evolving, it's essential to embrace life-long learning, so I'm hoping to gain some insight into how Richard does this too.

Rebecca Weeks' talk on 'Chasing the algorithm' is another one I'm looking forward to. It's a subject that does get covered and we all know that long-term strategies are the way forward, but Rebecca is looking at this from a slightly different angle and including a case study, which will, I'm sure, offer great insight in how to balance budget, client expectations and results.

Berian Reed from Auto Trader's talk on what he's learnt from working on a site the size of Auto Trader and how he has dealt with the recent algorithm updates will certainly be extremely useful. I'm looking forward to hearing how Berian formulated, implemented and took steps to 'future proof' their SEO strategy.

Tom Anthony's API talk will be interesting; the use of APIs is an area that seems to be constantly moving forward, and I'm anticipating some innovative and different ideas for their use.

I'm also looking forward to 7 things you need to know about Mobile SEO by Aleyda Solis. Everyone in the office who has heard Aleyda speak mentions the great content and delivery of her talks, so I'm anticipating another good one here.

I've only selected a couple of highlights, but I have to say that the line-up for this brightonSEO looks brilliant, and credit has to go to Kelvin for putting this together and keeping it free!

We also have 5 tickets to give away for brightonSEO. Winners will be selected at random from those who tweet/retweet this post – be sure to tag us (@seoptimise) in your tweets/retweets – or leave a comment saying you would like to enter the competition. Winners will be notified by 31st August. (Ts & Cs apply)

© SEOptimise - Download our free business guide to blogging whitepaper and sign up for the SEOptimise monthly newsletter.

Related posts:

  1. BrightonSEO 2012 Interview with Kelvin Newman
  2. BrightonSEO 2011 Roundup: who said what and why
  3. Glenn Jones on Microformats and SEO – BrightonSEO