Monday, 4 August 2014

Your website on Google (now with social data and reports)

SubmitStart Mailing. Unsubscribe from this list.


Get high rankings on Google
Your competitors use SEOprofiler to get better rankings.
See what you've been missing out on:



Everything you need to get top rankings on Google
SEOprofiler is the proven solution that helps you get your website to the top of Google's search results. It offers everything from link building and keyword research to website analytics (+ a lot more).


As easy as 1-2-3: track your results
Find the keywords that deliver the best results. See how your website is ranked for your keywords on Google, Bing and Yahoo in 68 countries. Integrates with Google Analytics.


Impress your boss and your clients
Create beautiful reports in your company design for your boss and your clients. Fully customizable with your own logo, colors, etc.


Spy on your competitors
Our powerful competitive intelligence tools enable you to spy on the backlinks, Google AdWords ads and Google rankings of your competitors. Find harmful links that point to your website and eliminate these bad links.


Trusted by 50,000 companies
More than 50,000 businesses use SEOprofiler to get high rankings on Google. If you haven't tested SEOprofiler yet, create a free trial account or buy SEOprofiler risk-free.

A complete suite of tools helps your competitors to get high rankings.
Sent to e0nstar1.blog@gmail.com | why did I get this?

unsubscribe from this list | update subscription preferences

SubmitStart · Trade Center · Kristian IV:s väg 3 · Halmstad 302 50 · Sweden

Mish's Global Economic Trend Analysis



How About Them Apples?

Posted: 04 Aug 2014 06:47 PM PDT

On July 30, Moscow blocked Polish fruit and vegetable imports, and mulled an EU-wide ban, citing "systematic violations of international and Russian phytosanitary requirements".

Everyone understands this was retaliation for further EU sanctions on Russia.
Russia has slapped a temporary ban on fruit and vegetable imports from Poland, claiming the products breach its standards.

Rosselkhoznadzor, the country's federal veterinary and phytosanitary control agency, issued a statement yesterday (30 July) saying it is to introduce a ban on several Polish fruit and vegetable products after it discovered "systematic violations of international and Russian phytosanitary requirements".

"Rosselkhoznadzor considers it necessary to introduce from 1 August 2014 as a temporary emergency phytosanitary measures restrictions on imports to Russia from Poland and Polish imports through third countries," the Russian food safety body said.

Items affected include apples, pears and quince, apricot, cherries, plus all vegetables except mushrooms.

In an interview with Reuters, a spokesperson for Rosselkhoznadzor said the move "was part of a VPSS plan to consider restricting all or some fruit imports from the entire EU". However, he denied the restrictions stemmed from the EU sanctions.

Bloomberg had reported Russia was also mulling the ban of chicken from the US, which has joined the EU in imposing sanctions on parts of the Russian economy.
Revenge

Reuters reports Russian ban on Polish produce is revenge for EU sanctions
Moscow, which buys more than 2 billion euros worth of EU fruit and vegetables a year, making it by far the biggest export market for the produce, said the ban was for sanitary reasons and denied a link to the sanctions.

Moscow has frequently been accused in the past of using food safety inspections to restrict trade from countries with which it has political disputes. The EU said it was studying the announcement, describing it as a surprise.

"The embargo amounts to political repression in response to the sanctions imposed by the European Union against Russia," Poland's agriculture ministry said in a statement.

According to European Commission figures, the EU sold Russia 1.2 billion euros worth of fruit and 886 million euros worth of vegetables in 2011, accounting for 28 percent of the bloc's exports of fruit and 21.5 percent of its vegetables. For some EU countries, including Poland, the percentages are even higher.

Poland is the largest exporter of apples in the world. In 2013 it exported apples worth 438 million euros ($587 million), of which 56 percent went to Russia, according to Poland's Ministry of Agriculture.

"I'm expecting the Polish apple producers to suffer," Witold Boguta, representing Poland's Association of Fruit and Vegetable Producers, told Reuters.
Surprise?

If EU bureaucrats really were surprised by this, they are stupider than I thought, which is saying quite a lot.

Why anyone should be surprised by this is a mystery. Retaliation should have been widely expected.

Poland Mocks Russia's Ban on Polish Fruit

In response to the ban, Poland Mocks Russia With Eat More Apples Campaign.
The produce ban is expected to affect Polish apples more than any other product. Poland is Europe's largest producer of apples, with more than half of its production going to Russia.

The "Puls Biznesu" newspaper called on Wednesday for a show of support for Poland's apple producers, urging people to eat more apples and to drink cider. Poles responded with humorous posts on Twitter under the hashtag #jedzjablka – Polish for "eat apples".

One Twitter user predicted that half of Warsaw would get drunk on cider over the weekend.

"An apple a day keeps Putin away!" wrote another Twitter user, in a reference to the Russian president.

Poland is only the latest in a series of countries that Russia has targeted with import bans. Russia announced on Thursday that it would ban the import of soy products, cornmeal and sunflowers from Ukraine. The move comes following bans on Ukrainian dairy products and canned foods that were imposed in recent days.

Russia has a history of banning imports from the countries it is in disputes with, usually citing safety concerns or violations. Last year it blocked the import of Ukrainian chocolates made by the company owned by candy magnate Petro Poroshenko, a pro-Western politician who is now Ukraine's president.

Earlier this month Russia blocked the import of Moldovan fruit after the country signed an association agreement with the EU. And it banned shipments of Georgian wine and mineral water just before the 2008 war with Georgia over South Ossetia.
Eat Apples  

Poles may get drunk on cider for a week or two while eating more apples than usual. Then what?

The population of Russia is about 146 million. The population of Poland is about 37 million. Poles would have to eat about four times as many apples per person as they used to.

Assuming that happens (which it probably won't beyond one week at most), at what price? Poland is going to have a lot of apples it will not know what to do with.

The Ukrainian economy is in ruins over the war and the collapse in trade with Russia, even as the cry from President Obama and Senator John McCain for more sanctions on Russia grows.

Sanctions are not very bright.

No one wins in a trade war. And Europe is about to find out in a big way.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com

Browser Wars: Google Chrome Passes Firefox With 20% Share; Mish Chrome Test Run

Posted: 04 Aug 2014 01:47 PM PDT

I have been a Mozilla Firefox user for what seems like forever. I never liked Microsoft's Internet Explorer browser.

Lately, Firefox has been quite irritating, especially when I have a large number of windows open. Firefox frequently crashes, then every page goes down. This has happened before at times, but crashes are even more frequent now.

Also, Firefox frequently locks up, and Adobe Flash is the culprit. This problem also seems to have gotten worse. To fix the lock-up problem, I open Task Manager and kill the Adobe Flash Player process. My Firefox pages then instantly free up.

Chrome Passes Firefox With 20% Share

Today I read, Chrome Passes 20% Share Milestone, Locks Up 2nd Place.
Computerworld - Google's Chrome browser in July broke the 20% user share bar for the first time, according to data published Friday by Web measurement vendor Net Applications.

But because the browser war is a zero-sum game, when Chrome won others had to lose. The biggest loser, as has been the case for the last year: Mozilla's Firefox, which came dangerously close to another milestone, but on the way down.

Firefox accounted for 15.1% of the desktop and laptop personal computer browsers used in July, a low point not seen by the open-source application since October 2007, a year before Chrome debuted and when Microsoft's Internet Explorer (IE) was only on version 7.

Chrome's July user share of 20.4% put the browser solidly in second place, but still far behind IE in Net Applications' tallies. IE's share last month was 58%, down slightly from the month before.

Firefox also lost user share in July, dropping half a percentage point to 15.1%. It was the ninth straight month that the desktop browser lost share. In the past three months alone, Firefox has fallen nearly two points.

The timing of the decline has been terrible, as Mozilla's current contract with Google ends in November. That deal, which assigned Google's search engine as the default for most Firefox customers, has generated the bulk of Mozilla's revenue. In 2012, for example, the last year for which financial data was available, Google paid Mozilla an estimated $272 million, or 88% of all Mozilla income.

Going into this year's contract renewal talks, Mozilla will be bargaining from a much weaker position, down 34% in total user share since July 2011.

Browser Wars



Mish Chrome Test Run

After reading the above article, I decided to give Chrome a spin.  Chrome imported my tab favorites from Firefox flawlessly.

Initial Appearance Different

The appearance on my blog looked different in each of IE, Firefox, and Chrome. It looked worst, by far, in Chrome. I could not get the fonts and text sizes to match.

The solution to that problem was to modify font-families specified on my blog.

I went with a simpler scheme of "font-family: Arial, Helvetica, sans-serif;" across the board after reading Which Font Should I Use for My Web Page?.

That scheme may not be the best, but it is likely to be the most consistent across all browsers.

Helvetica

Curiously, when I was attempting to fix the display issue with settings rather than in blog code, I noticed that Helvetica, a popular font, is not even in the selection list.

Each Window a Different Task

After the appearance issues were fixed, I liked what I saw. Task manager shows that each open Chrome window is its own task.



If a page crashes (I purposely crashed a Chrome page in task manager), you get a response that looks like this.



Firefox Crashes and Memory Leaks

In Chrome, if one page crashes, they don't all crash. I set up Firefox that way at one time, but the plugin container used an enormous amount of memory when I tried it, and I had to switch back.

Other users still report Firefox Crashes for various reasons.

I do believe Firefox has a memory leak of some sort. Memory use goes up and does not fully recover even if you start closing pages.

In Chrome, unlike Firefox, Google reports "Adobe Flash Player is directly integrated with Google Chrome and enabled by default". Hopefully, this will prevent the freeze-ups I experienced with flash in Firefox.

Translation, Settings, Other Features

Chrome provides built-in translation, a feature that will come in very handy for me. I frequently translate pages from Spanish or German, and now Russian and Ukrainian as well. The process was very cumbersome before. Now, it's one click.

Also, Google Chrome allows you to pick up settings and sessions from one computer to another. This is very handy for me, although some will object to Google storing all the information required to accomplish that task.

Anyway, I like what I see so far. If I run into no Chrome issues, it will be goodbye to Firefox for me.

It appears others may be making the same choice.

If you wish to give it a try, here is the Google Chrome Download Link

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com

438 Ukrainian Military Tired of War, Seek Asylum in Russia

Posted: 04 Aug 2014 10:08 AM PDT

Back in April, thousands of Ukrainian soldiers defected to the pro-Russian side.

It's been a while since we have seen reports like that. Today we have another report: 438 Ukrainian Military Seek Asylum in Russia.
The spokesman for the Border Guard Service of Russia, Vasili Maláyev, said that during the night of Sunday, "about 438 Ukrainian military approached the Russian border guards to seek asylum. According to the decision of the Border Guard Service of Russia, officials opened a humanitarian corridor and allowed into Russia those who need shelter."

The soldiers belong to the 72nd Mechanized Brigade of the Armed Forces of Ukraine. "They said they were tired of war and are no longer willing to fight," Maláyev reported.

Before leaving Ukraine, they destroyed their weapons and ammunition depots.

In recent weeks such incidents have become increasingly common. In recent days, Ukrainian soldiers marched through the neutral zone of the border between Russia and Ukraine unarmed and under a white flag.

According to Maláyev, the military, both officers and soldiers, also belonged to the 72nd Mechanized Brigade. The Ukrainian soldiers explained that they had run out of supplies, and that the ammunition they had left was not suitable for their weapons systems.
That is a rather curious source, and the English on the site was broken. I fixed a couple of spots.

The only other non-blog news media reference I could find was from RIA: Over 400 Ukrainian Military Personnel Request Refugee Status in Russia. Of course, Western mainstream media has no interest in reporting such things.

Here are some additional details from RIA.
"Overnight 438 Ukrainian military personnel turned to Russian border guards with a request for refugee [status]," the head of the FSB's border control in the southern Russian region of Rostov, Vasily Malaeyev, said.

Border control authorities have opened a humanitarian corridor and have allowed refugees into Russia. Among the 438 personnel, 164 are employees of Ukraine's State Border Service.

On Sunday, 12 soldiers from the Ukrainian Armed Forces made it into Russia and applied for asylum at the Gukovo checkpoint in Russia's Rostov Region, saying they had run out of food and ammunition.

Last month, another 40 Ukrainian troops abandoned their military units and asked independence supporters to allow them to come to Russia in order not to fight against their own people.
Ukraine's 72nd and 79th Brigades Pounded

Only the first link made reference to the 72nd Mechanized Brigade, a claim that seems highly credible.

For my July 26 reference to the entrapment of the 72nd brigade, please see Who's Winning the War in Ukraine? Answer May Shock You!

For a video on the demise of part of the 79th brigade, please see Ukraine's Army Advances; Unguided Rockets Kill Civilians; Demise of Rebels?

Lost Territory

Nonetheless, the rebels have lost half the territory they once held, some of it to scorched-earth tactics of the Ukrainian army carried out with no regard for civilians.

If the rebels lose much more territory, it will be over.

Yet my sources tell me the rebels are ordering winter supplies on the assumption the war will last quite some time.

Ukrainian Army Stretched to Limit?

For yet another piece of the puzzle, one that possibly indicates the Ukrainian army is stretched a bit too far, please consider Ukrainians Ordered to War, Women Burn the Military Writs.

Regardless of who "wins" militarily, the scars will take years, if not far longer to heal, and Ukraine will be beholden to the IMF and other creditors for decades.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com

50 African Leaders in the Nation's Capital

 
Here's what's going on at the White House today.
 
 
 
 
 
  Featured

50 African Leaders in the Nation's Capital

This week, President Obama will welcome 50 African leaders to the nation's capital, as part of the U.S.-Africa Leaders Summit. The three-day summit is the largest event that any U.S. president has held with African heads of state, and will build on the President's trip to Africa in 2013.

National Security Advisor Susan Rice sat down to preview the summit, and explain what this means for both the United States as well as for the African nations attending this historic event.

Watch Susan Rice preview the summit, and find out more about this historic event.

A preview of the U.S.-Africa Leaders Summit


 
 
  Top Stories

Weekly Address: It's Time for Congress to Help the Middle Class

In this week's address, the President discussed the new monthly jobs report and the fact that our economy created over 200,000 new jobs in July for the sixth straight month -- the longest streak since 1997. To ensure this momentum can be sustained, the President is pressing Congress to act to create jobs and expand opportunity from raising the minimum wage, to helping people pay back their student loans, to fair pay and paid leave.

READ MORE

Highlights from Dr. Biden's Trip to Africa

Last month, Dr. Jill Biden traveled to Africa on a three-country visit to Zambia, the Democratic Republic of the Congo, and Sierra Leone, to highlight how girls' education and women's participation in government and civil society can foster economic growth and strengthen government institutions.

READ MORE

Digital Briefing: An Update from Ben Rhodes on Ukraine

Deputy National Security Advisor Ben Rhodes provides an update on recent U.S. actions with regard to Ukraine and offers an overview of America's policy position.

READ MORE


 
 
  Today's Schedule

All times are Eastern Time (ET)

11:00 AM: The President and Vice President receive the Presidential Daily Briefing

11:45 AM: The President meets with Secretary of the Treasury Lew

12:00 PM: The Vice President delivers remarks at the U.S.-Africa Leaders Summit Civil Society Forum

12:45 PM: Press Briefing by Press Secretary Josh Earnest


 

Did Someone Forward This to You? Sign Up for Email Updates

This email was sent to e0nstar1.blog@gmail.com

Unsubscribe | Privacy Policy
Please do not reply to this email. Contact the White House

The White House • 1600 Pennsylvania Ave NW • Washington, DC 20500 • 202-456-1111


CRO Statistics: How to Avoid Reporting Bad Data


Posted: 03 Aug 2014 05:15 PM PDT

Posted by CraigBradford

Without a basic understanding of statistics, you can often present misleading results to your clients or superiors. This can lead to underwhelming results when you roll out new versions of a page which on paper look like they should perform much better. In this post I want to cover the main aspects of planning, monitoring and interpreting CRO results so that when you do roll out new versions of pages, the results are much closer to what you would expect. I've also got a free tool to give away at the end, which does most of this for you.

Planning

A large part of running a successful conversion optimisation campaign happens before a single visitor reaches the site. Before starting a CRO test, it's important to have:

  1. A hypothesis of what you expect to happen
  2. An estimate of how long the test should take
  3. Analytics set up correctly so that you can measure the effect of the change accurately

Assuming you have a hypothesis, let's look at predicting how long a test should take.

How long will it take?

As a general rule, the less traffic your site gets and/or the lower the existing conversion rate, the longer it will take to get statistically significant results. There's a great tool by Evan Miller that I recommend using before starting any CRO project. Enter the baseline conversion rate and the minimum detectable effect (i.e. the minimum percentage change in conversion rate that you care about: 2%? 5%? 20%?) and you'll get an estimate of how much traffic you'll need to send to each version. Working backwards from the traffic your site normally gets, you can estimate how long your test is likely to take. When you arrive on the site, you'll see the following defaults:

Notice the setting that allows you to swap between 'absolute' and 'relative'. Toggling between them will help you understand the difference, but as a general rule, people tend to speak about conversion rate increases in relative terms. For example:

Using a baseline conversion rate of 20%

  • With a 5% absolute improvement - the new conversion rate would be 25%
  • With a 5% relative improvement - the new conversion rate would be 21%

There's a huge difference in the sample size needed to detect any change as well. In the absolute example above, 1,030 visits are needed for each branch. If you're running two test versions against the original, that looks like this:

  • Original - 1,030
  • Version A - 1,030
  • Version B - 1,030

Total 3,090 visits needed.

If you change that to relative, the numbers change drastically: 25,255 visits are needed for each version, a total of 75,765 visits.

If your site only gets 1,000 visits per month and you have a baseline conversion rate of 20%, it's going to take you 6 years to detect a significant relative increase in conversion rate of 5% compared to only around 3 months for an absolute change of the same size.

This is why the question of whether or not small sites can do CRO often comes up. The answer is yes, they can, but you'll want to aim higher than a 5% relative increase in conversions. For example, if you aim for a 35% relative increase (with a 20% baseline conversion rate), you'll only need 530 visits for each version. In summary, go big if you're a small site. Don't test small changes like button tweaks; test complete new landing pages. Otherwise it's going to take you a very long time to get significantly better results.
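As a rough check on these numbers, the required sample size can be estimated with the standard normal-approximation formula for comparing two proportions. This is a sketch, not Evan Miller's exact method - his tool uses a slightly different calculation, so where it reports 1,030 per branch for the absolute example, this formula gives roughly 1,094 - but it's close enough for planning:

```python
import math

def norm_ppf(q):
    """Inverse standard normal CDF via bisection on math.erf."""
    cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if cdf(mid) < q:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def sample_size_per_branch(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed in each branch to detect a change from rate p1 to p2."""
    z_a = norm_ppf(1 - alpha / 2)   # two-sided significance threshold
    z_b = norm_ppf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# 20% baseline, 5% absolute improvement (new rate 25%):
print(sample_size_per_branch(0.20, 0.25))  # -> 1094 per branch
# 20% baseline, 5% relative improvement (new rate 21%):
print(sample_size_per_branch(0.20, 0.21))  # -> roughly 25,600 per branch
```

The absolute-vs-relative gap shows up immediately: shrinking the detectable difference from 5 points to 1 point multiplies the required traffic by more than 20.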

Analytics

A critical part of understanding your test results is having appropriate tracking in place. At Distilled we use Optimizely, so that's what I'll cover today; fortunately, Optimizely makes testing and tracking really easy. All you need is a Google Analytics account that has a custom variable (custom dimension in Universal Analytics) slot free. For either Classic or Universal Analytics, begin by going to the Optimizely Editor, then clicking Options > Analytics Integration. Select enable and enter the custom variable slot that you want to use; that's it. For more details, see the help section on the Optimizely website here.

With Google analytics tracking enabled, now when you go to the appropriate custom variable slot in Google Analytics, you should see a custom variable named after the experiment name. In the example below the client was using custom variable slot 5:

This is a crucial step. While you can get by just using Optimizely goals, like setting a thank-you page as a conversion, that doesn't give you the full picture. As well as measuring conversions, you'll also want to measure behavioral metrics. Using analytics allows you to measure not only conversions, but other metrics like average order value, bounce rate, time on site, secondary conversions, etc.

Measuring interaction

Another thing that's easy to measure with Optimizely is interactions on the page, things like clicking buttons. Even if you don't have event tracking set up in Google Analytics, you can still measure changes in how people interact with the site. It's not as simple as it looks though. If you try and track an element in the new version of a page, you'll get an error message saying that no items are being tracked. See the example from Optimizely below:

Ignore this message, as long as you've highlighted the correct button before selecting track clicks, the tracking should work just fine. See the help section on Optimizely for more details.

Interpreting results

Once you have a test up and running, you should start to see results in Google Analytics as well as Optimizely. At this point, there's a few things to understand before you get too disappointed or excited.

Understanding statistical significance

If you're using Google Analytics for conversion rates, you'll need something to tell you whether or not your results are statistically significant. I like this tool by KISSmetrics, which looks like this:

It's easy to look at the above and celebrate your 18% increase in conversions - however, you'd be wrong. It's easier to explain what this means with an example. Let's imagine you have a pair of dice that we know are exactly the same. If you were to roll each die 100 times, you would expect to see each of the numbers 1-6 about the same number of times on both dice (which works out at around 17 times per side). Let's say on this occasion, though, we are trying to see how good each die is at rolling a 6. Look at the results below:

  • Die A - 17/100 = 0.17 conversion rate
  • Die B - 30/100 = 0.30 conversion rate

A simplistic way to think about statistical significance is that it measures how confident we can be that getting more 6s on the second die wasn't just a fluke - that is, that the die really has been optimised in some way to roll 6s.

This makes sense when we think about it. Given that out of 100 rolls we expect to roll a 6 around 17 times, if the second die rolled a 6 19/100 times, we could believe that we just got lucky. But if we rolled a 6 30/100 times (76% more), we would find it hard to believe that we just got lucky; it's much more plausible that the second die was loaded. If you were to put these numbers into a statistical significance tool (two-sided t-test), it would say that B performed better than A by 76%, with 97% significance.

In statistics, statistical significance is the complement of the P value. The P value in this case is 3%, the complement therefore being 97% (100 - 3 = 97). This means there's a 3% chance that we'd see results this extreme if the dice are identical.

When we see statistical significance in tools like Optimizely, they have just taken the complement of the P-value (100-3 = 97%) and displayed it as the chance to beat baseline. In the example above, we would see a chance to beat baseline of 97%. Notice that I didn't say there's a 97% chance of B being 76% better - it's just that on this occasion the difference was 76% better.

This means that if we were to throw each die 100 times again, we're 97% sure we would see noticeable differences again, which may or may not be by as much as 76%. So, with that in mind, here is what we can accurately say about the dice experiment:

  • There's a 97% chance that die B is different to die A

Here's what we cannot say:

  • There's a 97% chance that die B will perform 76% better than die A
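The dice figures can be sanity-checked with a two-proportion z-test, a close cousin of the two-sided t-test mentioned above. A minimal sketch using only Python's standard library:

```python
import math

def norm_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - norm_cdf(abs(z)))

# Die A rolled a six 17/100 times, die B 30/100 times:
p = two_proportion_p_value(17, 100, 30, 100)
print(f"p-value: {p:.3f}")                    # -> 0.030
print(f"significance: {(1 - p) * 100:.0f}%")  # -> 97%
```

The 3% p-value and its 97% complement match the "chance to beat baseline" figure described above.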

This still leaves us with the question of what we can expect to happen if we roll version B out. To do this we need to use confidence intervals.

Confidence intervals

Confidence intervals help give us an estimate of how likely a change in a certain range is. To continue with the dice example, we saw an increase in conversions of 76%. Calculating confidence intervals allows us to say things like:

  • We're 90% sure B will increase the number of 6s you roll by between 19% to 133%
  • We're 99% sure B will increase the number of 6s you roll by between -13% to 166%

Note: These are relative ranges - that is, from 13% below to 166% above the baseline rate of 17%.
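Those intervals can be reproduced with a normal-approximation confidence interval on the difference of two proportions. A sketch (the exact endpoints come out at roughly 19.1% to 133.8% and -13.3% to 166.3%; the figures above drop the decimals):

```python
import math

def norm_ppf(q):
    """Inverse standard normal CDF via bisection on math.erf."""
    cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if cdf(mid) < q else (lo, mid)
    return (lo + hi) / 2

def relative_uplift_ci(p_a, p_b, n, confidence):
    """Confidence interval for B's uplift over A, as a % of A's rate."""
    se = math.sqrt(p_a * (1 - p_a) / n + p_b * (1 - p_b) / n)
    z = norm_ppf(1 - (1 - confidence) / 2)
    diff = p_b - p_a
    return ((diff - z * se) / p_a * 100, (diff + z * se) / p_a * 100)

print(relative_uplift_ci(0.17, 0.30, 100, 0.90))  # -> about (19.1, 133.8)
print(relative_uplift_ci(0.17, 0.30, 100, 0.99))  # -> about (-13.3, 166.3)
```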

The three questions you might be asking at this point are:

  1. Why is the range so large?
  2. Why is there a chance it could go negative?
  3. How likely is the difference to be on the negative side of the range?

The only way we can reduce the range of the confidence intervals is by collecting more data. To decrease the chance of the difference being less than 0 (we don't want to roll out a version that performs worse than the original), we need to roll the dice more times. Assuming the same conversion rates for A (0.17) and B (0.30), look at the difference that increasing the sample size makes to the range of the confidence intervals.

As you can see, with a sample size of 100 we have a 99% confidence range of -13% to 166%. If we kept rolling the dice until we had a sample size of 10,000 the 99% confidence range looks much better, it's now between 67% better and 85% better.
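The narrowing effect is easy to verify numerically. Here's a sketch of the 99% interval at both sample sizes, using 2.5758, the standard two-sided z-value for 99% confidence:

```python
import math

Z_99 = 2.5758  # standard two-sided z-value for 99% confidence

def relative_uplift_ci_99(p_a, p_b, n):
    """99% confidence interval for B's uplift over A, as a % of A's rate."""
    se = math.sqrt(p_a * (1 - p_a) / n + p_b * (1 - p_b) / n)
    diff = p_b - p_a
    return ((diff - Z_99 * se) / p_a * 100, (diff + Z_99 * se) / p_a * 100)

print(relative_uplift_ci_99(0.17, 0.30, 100))     # -> about (-13%, 166%)
print(relative_uplift_ci_99(0.17, 0.30, 10_000))  # -> about (67%, 85%)
```

The standard error shrinks with the square root of the sample size, which is why 100x the data gives an interval roughly 10x tighter.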

The point of showing this is to show that even if you have a statistically significant result, it's often wise to keep the test running until you have tighter confidence intervals. At the very least I don't like to present results until the lower limit of the 90% interval is greater than or equal to 0.

Calculating average order value

Sometimes conversion rate on its own doesn't matter. If you make a change that makes 10% fewer people buy, but those that do buy spend 10x more money, then the net effect is still positive.

To track this we need to be able to see the average order value of the control compared to the test value. If you've set up Google analytics integration like I showed previously, this is very easy to do.

If you go into Google analytics, select the custom variable tab, then select the e-commerce view, you'll see something like:

  • Version A 1000 visits - 10 conversions - Average order value $50
  • Version B 1000 visits - 10 conversions - Average order value $100

It's great that people who saw version B appear to spend twice as much, but how do we know we didn't just get lucky? To find out, we need to do some more work. Luckily, there's a tool that makes this very easy, again made by Evan Miller: the two-sample t-test tool.

To find out if the change in average order value is significant, we need a list of all the transaction amounts for version A and version B. The steps to do that are below:

1 - Create an advanced segment for version A and version B using the custom variable values.

2 - Individually apply the two segments you've just created, go to the transactions report under e-commerce and download all transaction data to a CSV.

3 - Dump data into the two-sample t-test tool

The tool doesn't accept special characters like $ or £ so remember to remove those before pasting into the tool. As you can see in the image below, I have version A data in the sample 1 area and the transaction values for version B in the sample 2 area. The output can be seen in the image below:

Whether or not the difference is significant is shown below the graphs. In this case the verdict was that sample 1 was in fact significantly different. To find out the size of the difference, look at the "d" value where it says "difference of means". In the example above, the transactions of the people who saw the test version were on average $19 higher than those of the people who saw the original.
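If you'd rather not export to an external tool, the same question can be answered in code. The sketch below uses a permutation test rather than the t-test (it makes fewer distributional assumptions and needs no lookup tables), on hypothetical transaction values invented purely for illustration:

```python
import random
from statistics import mean

def permutation_test(sample_a, sample_b, n_perm=10_000, seed=42):
    """Two-sided permutation test on the difference of means."""
    observed = abs(mean(sample_b) - mean(sample_a))
    pooled = list(sample_a) + list(sample_b)
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        relabelled = abs(mean(pooled[:len(sample_a)])
                         - mean(pooled[len(sample_a):]))
        if relabelled >= observed:
            extreme += 1
    return (extreme + 1) / (n_perm + 1)  # add-one correction

# Hypothetical transaction values (dollars) for each version:
orders_a = [48, 50, 52, 49, 51, 50, 47, 53]  # mean $50
orders_b = [68, 70, 72, 69, 71, 70, 67, 73]  # mean $70
p = permutation_test(orders_a, orders_b)
print(f"difference of means: ${mean(orders_b) - mean(orders_a):.0f}, p = {p:.4f}")
```

A tiny p-value here means almost no random relabelling of the orders produces a gap as large as the one observed, so the AOV difference is very unlikely to be luck.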

A free tool for reading this far

If you run a lot of CRO tests, you'll find yourself using the above tools a lot. While they are all great tools, I like to have everything in one place. One of my colleagues, Tom Capper, built a spreadsheet which does all of the above very quickly. There are two sheets: conversion rate and average order value. The only data you need to enter in the conversion rate sheet is conversions and sessions; in the AOV sheet, just paste in the transaction values for both data sets. The conversion rate sheet calculates:

  1. Conversion rate
  2. Percentage change
  3. Statistical significance (one sided and two sided)
  4. 90, 95 and 99% confidence intervals (relative and absolute)

There's an extra field that I've found really helpful (working agency side) that's called "Chance of <=0 uplift".

If, like the example above, you present results that have a potentially negative lower bound on a confidence interval:

  • We're 90% sure B will increase the number of 6s you roll by between 19% and 133%
  • We're 99% sure B will increase the number of 6s you roll by between -13% and 166%

The logical question a client is going to ask is: "What chance is there of the result being negative?"

That's what this extra field calculates. It gives us the chance of rolling out the new version of a test and the difference being less than or equal to 0%. For the data above, the 99% confidence interval was -13% to +166%. The fact that the lower limit of the range is negative doesn't look great, but using this calculation, the chance of the difference being <=0% is only 1.41%. Given the potential upside, most clients would agree that this is a chance worth taking.
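That figure can be reproduced directly: under the same normal approximation used for the confidence intervals, the chance of a <=0 uplift is just the normal CDF of the observed difference evaluated at zero. A sketch using the dice data:

```python
import math

def chance_of_no_uplift(p_a, p_b, n):
    """Probability that the true uplift of B over A is <= 0,
    under a normal approximation of the observed difference."""
    se = math.sqrt(p_a * (1 - p_a) / n + p_b * (1 - p_b) / n)
    z = (p_b - p_a) / se
    return 0.5 * (1 + math.erf(-z / math.sqrt(2)))  # normal CDF at -z

# Dice data: 17/100 vs 30/100
print(f"{chance_of_no_uplift(0.17, 0.30, 100) * 100:.2f}%")  # -> 1.41%
```

The 1.41% output matches the spreadsheet figure quoted above, which is reassuring: the "Chance of <=0 uplift" field is just this tail probability.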

You can download the spreadsheet here: Statistical Significance.xls

Feel free to say thanks to Tom on Twitter.

This is an internal tool so if it breaks, please don't send Tom (or me) requests to fix/upgrade or change.

If you want to speed this process up even more, I recommend transferring this spreadsheet into Google docs and using the Google Analytics API to do it automatically. Here's a good post on how you can do that.

I hope you've found this useful and if you have any questions or suggestions please leave a comment.

If you want to learn more about the numbers behind this spreadsheet and statistics in general, some blog posts I'd recommend reading are:

Why your CRO tests fail

How not to run an A/B test

Scientific method: Statistical errors


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Seth's Blog : Is authenticity authentic?

 

Is authenticity authentic?

Perhaps the only truly authentic version of you is just a few days old, lying in a crib, pooping in your pants.

Ever since then, there's been a cultural overlay, a series of choices, strategies from you and others about what it takes to succeed in this world (in your world).

And so it's all invented.

When you tell me that it would be authentic for you to do x, y or z, my first reaction is that nothing you do is truly authentic, it's all part of a long-term strategy for how you'll make an impact in the world.

I'll grant you that it's essential to be consistent, that people can tell when you shift your story and your work in response to whatever is happening around you, and particularly when you say whatever you need to say to get through the next cycle. But consistency is easier to talk about and measure than authenticity is.

The question, then, is what's the impact you seek to make, what are the changes you are working for? And how can you achieve that and still do work you're proud of?

       

 

More Recent Articles

[You're getting this note because you subscribed to Seth Godin's blog.]

Don't want to get this email anymore? Click the link below to unsubscribe.




Email subscriptions powered by FeedBlitz, LLC, 365 Boston Post Rd, Suite 123, Sudbury, MA 01776, USA.