Monday, October 31, 2016

A dark chocolate sampler

Bean to bar dark chocolate is a revelation. It's got the terroir and backstory of the finest wines, it's a collision of rural farmers and modern technology and markets similar to coffee, and it also brings along the Proustian nostalgia of childhood.

Too many of us have been stuck in a Nestle/Hershey universe for too long. And if your early collisions with dark chocolate aren't positive, it's easy to decide it's not worth the trouble or expense. 

[I get at least 10 servings out of a $10 bar, though, so it's hard for it to feel like a ridiculously expensive luxury. If you skip an espresso...]

Here, in no real order, are my favorite brands, all good to start with, all great to stick with. Every one is made by a human who cares, someone you could meet, engage with, and root for.

Askinosie
Rogue (entire production already sold out)
Original Beans
Cacao Hunters
Patric
Dick Taylor
Ritual
Soma
Fruition

A few simple understandings and principles:

The percentage matters, in the sense that chocolate that is between 70% and 90% dark is a Platonic ideal of flavor. I avoid flavoring agents like candies, seeds or salt, because what I'm trying to taste is the bean and what the maker has done to bring it to life.

The kind of bean matters. Forastero beans are cheap, easy to grow and not particularly worth seeking out (with a few exceptions). On the other hand, Criollo (particularly the wonderful rare Porcelana hybrid, which you can find from Soma and Original Beans) is a party in your mouth--but, alas, the hardest to grow. It's always that way, isn't it? And Trinitario beans are the backbone of this hobby.

The country matters. Yes, with practice, it's actually easy to tell the difference between Madagascar and Colombia.

And finally, the maker's relationship with the grower matters a lot. Askinosie imports their own beans, and does amazing work with the farmers who work so hard to grow them.

Enjoy. Halloween doesn't have to mean bad chocolate any more! And don't even get me started on candy corn.

       

What does the poll say?

It says that people don't understand polls. Even smart marketers get it wrong.

What do people think? There's a lot of confusion, much of it intentional, some spawned by a presumed fear of simple math, all of it worth clearing up. 

A survey is not a poll is not a census. A census is what you get if you ask every single person what they think or who they are. There are only two reasons to have a census. Either you want each person to feel personally involved (hence an election) or you are keeping track of each person's answer. For example, if you're printing up t-shirts for the Frisbee team, you ought to do a census of the team to find out what size each person wants, then deliver each person the size they seek.

You could do a survey, which is merely a collection of answers from whoever cares enough to answer the survey. A survey is a useful tool for brainstorming, but it shouldn't be confused with what the group actually feels. Your lack of rigor in setting it up is repaid in a lack of precision in the data.

And a poll? A poll is a smart shortcut, a statistical method for replacing a census (asking everyone) with a very close approximation achieved by asking the minimum number of people required to get a useful answer. A properly done poll will get you an answer nearly as useful as an accurate census will, without the expense or the time.

It rarely makes sense to ask all of your customers how they feel. You're wasting their time (and yours) by adding more entries to the database without making the database any more accurate. Part of the problem is that the only people who answer surveys are either annoyed or have nothing better to do, and simply making a poll bigger doesn't make it better.

When big companies ask you to fill out a quick survey after talking to a customer service rep, they're not actually doing a survey. What they're doing is snooping on their customer service people, and your answers are directly connected back to each rep, so that person can be scolded (or worse) if they do a bad job.

A poll doesn't predict the future. The media has completely missed this point, again and again. If, on the day the iPhone was announced, you had done a well-designed poll of adults and asked, "Do you intend to ever buy a smartphone?" the yesses would have certainly been less than 5% of the result.

Of course, a decade later, that's turned out to be completely wrong. Was the poll in error?

No.

An accurate poll is a snapshot of right now, based on what's happening today. That's all. If outcomes end up being different a week or a year later, that's not the poll's fault, it's our mistaken belief that the future can be predicted.

To go one step further, the question that gets asked is as important as the answer. Try this at home: When you ask people a question, they rarely give you the straight up truth in their answer, especially when there are social factors at play. The very best polls combine not only the right math, but more important, the right question structure.

The magic of sample size. Let's say you had a bag of M&Ms. You know they come in six colors and you want to figure out the percentage of each that's in the bag. As long as the candies are well mixed within the bag, it turns out that no matter how many are in the bag, whether it's a 2-pound bag or a 2,000-pound bag, all you need to do is randomly pull out 300 to 400 M&Ms. That's plenty. More samples won't dramatically increase the quality of this poll.
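Here's a quick sketch of that claim in code (the color mix below is made up, and the bag is assumed to be well mixed): a few hundred random draws land within a couple of percentage points of the true mix, whether the bag holds two thousand candies or two million.

import random

# Hypothetical "true" color mix for the bag (illustrative numbers only).
TRUE_MIX = {"brown": 0.13, "yellow": 0.14, "red": 0.13,
            "blue": 0.24, "orange": 0.20, "green": 0.16}

def make_bag(size):
    """Build a well-mixed bag of `size` candies following TRUE_MIX."""
    colors, weights = zip(*TRUE_MIX.items())
    return random.choices(colors, weights=weights, k=size)

def estimate_mix(bag, sample_size=350):
    """Estimate the color proportions from a random sample of the bag."""
    sample = random.sample(bag, sample_size)
    return {color: sample.count(color) / sample_size for color in TRUE_MIX}

# The same 350-candy sample works for a small bag and a huge one.
for bag_size in (2_000, 2_000_000):
    estimate = estimate_mix(make_bag(bag_size))
    worst_error = max(abs(estimate[c] - TRUE_MIX[c]) for c in TRUE_MIX)
    print(f"bag of {bag_size:>9,} candies: worst color error ~ {worst_error:.1%}")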

The purpose of the sample is to pick a random selection from a coherent group.

The key to this is understanding that a small sample works for a group of any size, as long as the group is consistent in its makeup. As soon as you can divide the group into distinct buckets, you benefit by sampling each bucket separately.

Most of the well-done polls you hear about in public do not have a sample size problem. It's a red herring.

The power of bucketing. But what happens if you realize that there's more than one kind of M&M, and that different kinds have different color distributions? (This, it turns out, for mysterious reasons, is true. Almond M&Ms only come in five colors.)

Well, you could take this into account and run much bigger sample groups, or you could get smart about sample size.

It turns out, for example, that women who ride Harley Davidson motorcycles want different things from them than men do. It also turns out that (I'm guessing about all the Harley stats here) perhaps 10% of the people who buy a Harley are women.

Given that, you could poll 300 women (the easy minimum) and then 2700 men (so you get the balance right). OR, you could get smart, and poll 300 women and 300 men (because every time you add a new person, it's really expensive). "But wait," you might say, "that's not right, because women are overrepresented."

So far, that's true. But after you figure out how women think, and then figure out how men think, you can weight each group's answers by its share of the customer base in your final tally. If, for example, you discovered that women intend to buy a new Harley every two years, but men intend to buy one every six years, you could then report back that the average customer intends to buy a new Harley every five and a half years or so. (Said with full knowledge that it's dangerous to average averages, but in this case, it's correct.)
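A tiny sketch of that weighting step, using the made-up Harley numbers above (10% of buyers are women, who intend to buy every two years; 90% are men, who intend to buy every six):

# Weighting a stratified poll back to the population. These numbers are
# the blog's illustrative Harley figures, not real survey data.
population_share = {"women": 0.10, "men": 0.90}

# Average intended purchase interval (in years) reported by 300
# respondents in each bucket (hypothetical poll results).
poll_average_years = {"women": 2.0, "men": 6.0}

# Each bucket counts in proportion to its share of the real customer
# base, not its share of the respondents.
weighted_average = sum(population_share[group] * poll_average_years[group]
                       for group in population_share)

print(f"Average intended interval: {weighted_average:.1f} years")
# Prints 5.6 years, i.e. "every five and a half years or so."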

Confusion about polls is easy. And the more we try to make decisions using polls, the more careful we need to be about the structure and motivation of the poll itself. 

But finding an accurate poll is pretty easy as well. Most pollsters (in private and in public work) are transparent about their methods, and the magic of statistics is that the math of how the poll is structured can be checked by others. 

Too often, marketers do surveys, not polls, or bother everyone with a census, poorly done. Worse, they then use these results as an accurate prediction of the future, instead of a reliable snapshot of now.

It's the surveys that are so often wrong, deceptive and confusing. It's surveys ("no one I know believes that") that feel like they're accurate but rarely are.

And if we're going to challenge a poll, far smarter to challenge the questions ("that's designed to get the respondent to lie") or the flaws in sampling ("this requires all polled individuals to have a home phone, but of course, an entire generation of young people doesn't have one.")

But it makes no sense at all to throw out the results of polls we disagree with. The quality of the cars we drive and the efficacy of the medicines we take are directly related to the very same statistical techniques that we use to run a poll. Ask the right questions to the right people and your snapshot is going to be helpful.

If you want to, be wary of polls. But be wary for the right reasons.

       
