vineri, 19 august 2011

Damn Cool Pics



Horsemaning From Around The World

Posted: 19 Aug 2011 02:51 PM PDT

Horsemaning originated in America, and more and more people around the world are now adopting the new entertainment activity. It's the same sh*t as planking, owling, etc., but it looks more fun. Here are the weirdest and most shocking pics of horsemaning from all around the world.

Related Post:
Horsemaning is the New Planking

In Paris


In Belgium


In Budapest


In Holland


In Melbourne, Australia


In Turkey


In Thailand


Still In Thailand


At The Nike Store


In Fischamend, Austria


On Google+


At Cold Stone Cremery


At The Exotic Huffington Post Offices (NYC)


At The MTV Offices (Times Square, NYC)


At A Deli (Location Unknown)


On The Beach (South Of France)


On The Radio


In The Style Of The 18th Century


At NBC Studios (NYC)


At BuzzFeed HQ (Soho, NYC)


On The Planet Kamino?


In A Basement, While Playing Call Of Duty: Black Ops (Location Unknown)


In A Pool (Location Unknown)


In An Office…With A Giant Knife


Rural Nebraska?


American People Vs. Japanese People

Posted: 19 Aug 2011 02:46 PM PDT

A fun comic introducing some curious differences between Japanese and American people. I'm sure it was created by someone from Japan.


Source: 9gag


SEOmoz Daily SEO Blog



Introducing SEOmoz PRO Perks

Posted: 19 Aug 2011 12:06 AM PDT

Posted by AndrewDumont

Howdy, SEOmoz family! I'm Andrew, the new Business Development Director over at SEOmoz, and this is my first post on the blog. Great relationships are all about kicking things off on the right foot... or at least that's what I hear. That in mind, SEOmoz PRO members, we've got a nice little treat for you. 

If you've spent any time poking around the PRO Dashboard, you've probably run into something called the "PRO Member Discount Store." Sound familiar? Initially, the idea was to pack it full of amazing discounts on web services we loved, exclusively for PRO members. Due to a lack of vetting on our part, the store quickly turned into a catch-all that didn't provide much value at all, and that ain't right. We figured it was time to revive that property with its original, oh-so-good intent.

But before we get into the unveiling, we first wanted to talk about goals. In rethinking the discount store, we knew we had to keep the following principles at the core:

  • Quality - With the previous store, we lost track of the most important thing: quality. For service discounts to provide value, they need to be 1) competitive and 2) available for services we know and love. In the new store, you'll notice the quality in services and discounts the moment you hit the page. 
  • Exclusivity - Without focusing on exclusivity, we weren't able to offer up the type of competitive discounts we knew you wanted. In the new store, we cut back on the number of services, and in turn increased the size of discounts offered. 
  • Simplicity - When you're on the verge of a great deal, you want to take advantage of it right away. In the new store, we removed all of the guesswork, making the path to coupon redemption simple and only one step away.

With that, ladies and gents, we give you SEOmoz PRO Perks.

Jam-packed with some of the hottest services on the web, you'll quickly see why we're calling them perks. We went out and talked to our favorite tools in the social monitoring, affiliate management, customer service, and on-page optimization spaces to bring you some serious heat right off the bat. To provide a few highlights, here are some of our current perks.  

  • 50% Off Unbounce
  • 50% Off Geckoboard
  • 40% Off Invoicera 
  • 40% Off WooThemes
  • 30% Off SnapEngage
  • 30% Off ViralHeat
  • 20% Off Wistia 
  • 20% Off Get Satisfaction
  • And much more...

Piqued your interest? Head on over to SEOmoz's new PRO Perks store, and indulge as you wish -- being an SEOmoz PRO definitely has its perks. If you're still not an SEOmoz PRO, there's no better time than the present to give our 30-day free trial a go. 

Have some ideas for other services you'd like to see in our PRO Perks store? Drop me a line at andrew[at]seomoz.org, or let us know in the comment section below. We want to make sure that our PRO Perks are consistently bringing that hotness. 



Statistics: Don't Make These Mistakes - Whiteboard Friday

Posted: 18 Aug 2011 02:05 PM PDT

Posted by Aaron Wheeler

 Statistics can be very powerful tools for SEOs, but it can be hard to extract the right information from them. People understandably make a lot of mistakes in the process of taking raw data and turning it into actionable numbers. If you know what potential mistakes can be made along the way, you're less likely to make them yourself - so listen up! Will Critchlow, the co-founder and chief strategist at Distilled, has a few tips on how to use valid and sound statistics reliably. Use a lot of statistics yourself? Let us know what you've learned in the comments below!

 

Video Transcription

Howdy, Whiteboard fans. Different look this week. Will Critchlow from Distilled. I am going to be talking to you about some ways you can avoid some common statistical errors.

I am a huge fan of the power of statistics. I studied it. I have forgotten most of the technical details, but I use it in my work. We use it a lot at Distilled. But it is so easy to make mistakes that are really easy to avoid. Most of it comes from the fact that humans aren't naturally very good at dealing with numbers in general, and with statistics and probability in particular.

The example I like to use to illustrate that is imagine that we have a disease that we are testing for, a very rare disease, suppose. So, we have a population of people, and some small, some tiny proportion of these people have this rare disease. There is just this tiny, tiny sliver at the top. Maybe that is 1 in 10,000 people, something like that. We have a test that is 99% accurate at diagnosing when people have or don't have this disease. That sounds very accurate, right? It's only wrong 1 time in 100. But let's see what happens.

We run this test, and out of the main body of the population, I am going to exaggerate slightly, most people are correctly diagnosed as not having the disease. But 1% of them, this bit here, are incorrectly diagnosed as having the disease. That's the 99% correct but 1% incorrect. Then we have the tiny sliver at the top, which is a very small number of people, and again 99% correct, a small percent are incorrectly told they don't have it. Then, if we just look at this bit in here, zoom in on there, what we see is actually of all the people who are diagnosed as having this disease, more of them don't have it than do. Counterintuitive, right? That's come from the fact that, yes, our test is 99% accurate, but that still means it is wrong 1 in 100 times, and we're actually saying it is only 1 in 10,000 people who have this disease. So, if you are diagnosed as having it, it is actually more likely that is an error in the diagnosis than that you actually have this very rare disease. But we get this wrong. Intuitively people, generally, everyone would be likely to get this kind of question wrong. Just one example of many.
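The arithmetic behind that counterintuitive result can be sketched in a few lines. This is just Bayes' theorem applied to the hypothetical numbers used above (1-in-10,000 prevalence, 99% accuracy), not anything from the video itself:

```python
# Base-rate example from the transcript: a rare disease and a 99%-accurate test.
prevalence = 1 / 10_000   # P(disease): 1 in 10,000 people
sensitivity = 0.99        # P(positive | disease)
specificity = 0.99        # P(negative | no disease)

# Bayes' theorem: P(disease | positive test)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.4f}")
# Under 1% of positive results are true positives: a positive test is far
# more likely to be a diagnostic error than an actual case of the disease.
```

The false positives from the huge healthy population (1% of 9,999 in every 10,000) simply swamp the handful of true positives, which is exactly the intuition the transcript is driving at.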

Here are some things that may not be immediately obvious, but that you should bear in mind if you are working with statistics.

Number one, independence is very, very important. If I toss a coin 100 times and get 100 heads, then if those were independent coin flips, there is something very, very odd going on there. If that is a coin that has two heads on it, in other words, they're not in fact independent, the chance of me getting a head is the same on every one, then they're completely different results. So, make sure that whatever it is you are testing, if you are expecting to do analysis over the whole set of trials, that the results are actually independent. The common ways this falls down are when you are dealing with people, humans.

If you want reproducible results, you have to watch out: if you accidentally manage to include the same person multiple times, their answers to a questionnaire, for example, will be skewed the second time they answer it if they have already seen the site previously, if you are doing user testing or those kinds of things. Be very careful to set up your trials, whatever it is that you are testing, for independence. Don't over-worry about this, but realize that it is a potential problem. One of the things we test a lot is display copy on PPC ads. Here you can't really control who is seeing those, but just realize there is not a pure analysis going on there, because many of the same people come back to a site regularly and have therefore seen the ad day after day. So there is a skew, a lack of independence.

On a similar note, all kinds of repetition can be problematic, which is unfortunate because repetition is kind of at the heart of any kind of statistical testing. You need to do things multiple times to see how they pan out. The thing I am talking about here particularly is you often have seen confidence intervals given. You have seen situations where somebody says we're 95% sure that advert one is better than advert two or that this copy converts better than that copy or that putting the checkout button here converts better than putting the checkout button there. That 95% number is coming from a statistical test. What it is saying is, it is assuming a whole bunch of independence of the trials, but it is essentially saying the chance of getting this extreme a difference in results by chance if these two things were identical is less than 5%. In other words, fewer than 1 in 20 times would this situation arise by chance.

Now, the problem is that we tend to run lots of these tests in parallel or sequentially. It doesn't really matter. So, imagine you are doing conversion rate optimization testing and you tweak 20 things one after another. Each time you test it against this model and you say, first of all I am going to change the button from red to green. Then I am going to change the copy that is on the button. Then I am going to change the copy that is near the button. Then I am going to change some other thing. You just keep going down the tree. Each time it comes back saying, no, that made no difference, or statistically insignificant difference. No, that made no difference. No, that made no difference. You get that 15 times, say. On the 16th time, you get a result that says, yes, that made a difference. We are 95% sure that made a difference. But think about what that is actually saying. That is saying the chance of this having happened randomly, where the two things you are testing between are actually identical, is 1 in 20. Now, we might expect something that would happen 1 in 20 times possibly to come up by the 16th time. There is nothing unusual about that. So actually, our test is flawed. All we've shown is we just waited long enough for some random occurrence to take place, which would have happened definitely at some point. So, you actually have to be much more careful if you are doing those kinds of trials.
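The "16th test finally comes up significant" effect above is easy to quantify. A minimal sketch, assuming each test on identical variants has an independent 5% false-positive rate (the flip side of 95% confidence):

```python
# Multiple-testing trap: if each A/B test on genuinely identical variants
# has a 5% chance of a spurious "significant" result, the chance that at
# least one test in a sequence comes up "significant" grows quickly.
alpha = 0.05  # false-positive rate of a single test at 95% confidence

for n_tests in (1, 5, 16, 20):
    p_spurious_win = 1 - (1 - alpha) ** n_tests
    print(f"{n_tests:2d} tests: {p_spurious_win:.0%} chance of at least one spurious 'win'")
```

By 16 sequential tests the chance of at least one false "win" is already over 50%, which is why the transcript's advice is to stop and rerun a surprising result before telling anyone.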

One thing that works very well, which scuppers a lot of these things, is be very, very careful of this kind of thing. If you run these trials sequentially and you get a result like that, don't go and tell your boss right then. Okay? I've made this mistake with a client, rather than the boss. Don't get excited immediately because all you may be seeing is what I was just talking about. The fact that you run these trials often enough and occasionally you are going to find one that looks odd just through chance. Stop. Rerun that trial. If it comes up again as statistically significant, you are now happy. Now you can go and whoop and holler, ring the bell, jump and shout, and tell your boss or your clients. Until that point, you shouldn't because what we very often see is a situation where you get this likelihood of A being better than B, say, and we're like we're 95% sure here. You go and tell your boss. By the time you get back to your desk, it has dropped a little bit. You're like, "Oh, um, I'll show in a second." By the time he comes back over, it has dropped a little bit more. Actually, by the time it has been running for another day or two, that has actually dropped below 50% and you're not even sure of anything anymore. That's what you need to be very careful of. So, rerun those things.

Kind of similar, don't train on your sample data. If you are looking for a correlation between, or suppose you're trying to model search ranking factors, for example. You're going to take a whole bunch of things that might influence ranking, see which ones do, and then try to predict some other rankings. If you get 100 rankings, you train a model on those rankings, and then you try and predict those same rankings, you might do really well, because if you have enough variables in your model, it will actually predict it perfectly because it has just learned, it has effectively remembered those rankings. You need to not do that. I have actually made this mistake with a little thing that was trying to model the stock market. I was, like, yes, I am going to be rich. But, in fact, all it could do was predict stock market movements it had already seen, which it turns out isn't quite as useful and doesn't make you rich. So, don't train on your sample data. Train on a set of data over here, and then much like I was saying with the previous example, test it on a completely independent set of data. If that works, then you're going to be rich.
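The "it has effectively remembered those rankings" failure mode can be shown with a toy example (this is an illustration of the general point, not Will's ranking or stock-market model): a model with enough free parameters fits its own training noise perfectly, then falls apart on fresh data.

```python
# Toy overfitting sketch: a degree-9 polynomial through 10 noisy points
# "memorizes" the training data, so training error is tiny while error on
# fresh points from the same process is far larger.
import numpy as np

rng = np.random.default_rng(0)

def noisy_line(x):
    return 2 * x + rng.normal(0, 1, size=x.shape)  # true signal + noise

x_train = np.linspace(0, 1, 10)
y_train = noisy_line(x_train)
x_test = np.linspace(0, 1, 10) + 0.02  # fresh, unseen points
y_test = noisy_line(x_test)

# 10 points, 10 polynomial coefficients: the fit interpolates the noise.
coeffs = np.polyfit(x_train, y_train, deg=9)
train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)

print(f"training error: {train_err:.2e}, held-out error: {test_err:.2e}")
```

The fix is exactly what the transcript prescribes: fit on one set of data, then evaluate on a completely independent held-out set, and only trust the held-out number.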

Finally, don't blindly go hunting for statistical patterns. In much the same way that when you run a test over and over again and eventually the 1 in 20, the 1 in 100 chance comes in, if you just go looking for patterns anywhere in anything, then you're definitely going to find them. Human brains are really good at pattern recognition, and computers are even better. If you just start saying, does the average number of letters in the words I use affect my rankings, and you find a thousand variables like that, that are all completely arbitrary and there is no particular reason to think they would help your rankings, but you test enough of them, you will find some that look like they do, and you'll probably be wrong. You'll probably be misleading yourself. You'll probably look like an idiot in front of your boss. That's what this is all about, how not to look like an idiot.
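That "test a thousand arbitrary variables and some will look predictive" claim can be demonstrated directly. A small simulation, with made-up numbers chosen only for illustration: 1,000 completely random "ranking factors" tested against equally random rankings.

```python
# Blind pattern hunting: correlate 1,000 purely random candidate "ranking
# factors" with random rankings. By chance alone, dozens look "significant".
import numpy as np

rng = np.random.default_rng(42)
rankings = rng.normal(size=100)             # 100 fake ranking positions
factors = rng.normal(size=(1000, 100))      # 1,000 arbitrary candidate factors

# Pearson correlation of each factor with the rankings.
r = np.array([np.corrcoef(f, rankings)[0, 1] for f in factors])

# With n=100 samples, |r| > 0.2 is roughly the p < 0.05 threshold.
spurious = int(np.sum(np.abs(r) > 0.2))
print(f"{spurious} of 1000 random factors look 'significant'")
```

None of these factors has any real relationship to the rankings, yet a naive hunt would report a few dozen "discoveries"; without an a-priori reason to test a variable and an independent confirmation, that is exactly how you end up looking like an idiot in front of your boss.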

I'm Will Critchlow. It's been a pleasure talking to you on Whiteboard Friday. I'm sure we'll be back to the usual scheduled programming next week. See you later.

Video transcription by Speechpad.com



Get on the Bus: Opening the Door to New Jobs

The White House Friday, August 19, 2011
 

This week President Obama traveled to the Midwest where he met with Americans in rural towns and communities in Minnesota, Iowa and Illinois. The purpose of his trip, dubbed the Economic Rural Tour 2011, was to talk to people from different walks of life about what is happening in our country right now.

The President was there to talk, but also to listen. In town halls, county fairs and an economic forum, Americans shared their hopes for the future and their concerns about the economy and what it means for their businesses and their families.

Throughout his trip, the President proposed a series of common sense steps Congress can take immediately upon their return to Washington that will start rebuilding our economy. These include:

  • Extend the payroll tax cut so that middle class families have more money in their paychecks next year.  If you've got more money in your paycheck, you're more likely to spend it, and that means businesses of all sizes will have more customers. They'll be in a better position to hire. 
  • Extend unemployment benefits so that millions of workers who are still pounding the pavement looking for jobs can support their families.
  • Pass a bipartisan road construction bill. There are over a million construction workers out of work after the housing boom went bust, just as a lot of America needs rebuilding.  We can put these workers back to work by rebuilding our roads and bridges and railways. 
  • Pass the patent reform bill to help our innovators and entrepreneurs get their job-creating ideas to market faster.
  • Pass the trade agreements that will help businesses sell more American-made goods and services to Asia and South America, supporting tens of thousands of jobs here at home.
  • Put our bright, talented and skilled veterans returning from Iraq and Afghanistan to work. The President has proposed several initiatives to make sure our veterans are able to navigate this difficult labor market and succeed in the civilian workforce, including the Returning Heroes and Wounded Warrior Tax Credits, and his challenge to the private sector to train or hire 100,000 unemployed veterans.

As President Obama said this week, “there is nothing wrong with our country, there is something wrong with our politics…When this country is operating off a common ground, nobody can stop us. But when we’re divided, then we end up having a whole lot of self-inflicted problems.”

Check out videos and highlights of all of the events from the Economic Rural Tour:

Monday August 15, 2011:

Townhall in Cannon Falls, MN

Townhall in Decorah, IA

Tuesday August 16, 2011:

White House Rural Economic Forum in Peosta, IA

Wednesday August 17, 2011:

Townhall in Atkinson, IL

Townhall in Alpha, IL

P.S. - On Tuesday, President Obama gave a few words in his closing remarks on how his bus trip through rural America reminded him of why he wanted to get into public service in the first place:

Sometimes there are days in Washington that will drive you crazy. But getting out of Washington and meeting all of you, and seeing how hard you're working, how creative you are, how resourceful you are, how determined you are, that just makes me that much more determined to serve you as best I can as President of the United States. 

Watch the full video of his remarks.




The White House • 1600 Pennsylvania Ave NW • Washington, DC 20500 • 202-456-1111