Survey questions
Is this a survey or a census? A survey is statistically based: it extracts insight from a sample and asserts that the finding holds across a wider population. A census involves asking everyone, and usually matching the answers to each person so you can take further action.
If it's a survey, you probably don't need to reach as many people as you think you do. And if it's a survey, you are almost certainly going to get skewed answers, because surveying the people who answer surveys is truly different from surveying a statistically valid sample of your audience. SurveyMonkey doesn't actually run surveys of your total audience. It runs a poll of people who are willing to answer the questions.
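A rough illustration of why a survey needs fewer people than you'd guess, using the standard margin-of-error formula for a proportion (the function name and numbers below are my own, assuming simple random sampling, a 95% confidence level, and the worst-case 50/50 split):

```python
import math

def sample_size(margin_of_error: float, confidence_z: float = 1.96, p: float = 0.5) -> int:
    """Approximate respondents needed for a proportion, assuming simple random sampling."""
    return math.ceil((confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2)

# About 1,068 respondents give a +/-3% margin at 95% confidence,
# whether your audience is 50,000 people or 5 million.
print(sample_size(0.03))  # -> 1068
```

The catch, of course, is that those respondents have to be a statistically valid sample, not merely the people willing to click through.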
It's pretty easy to survey everyone: ask every customer a question at checkout. In fact, online, it's easier to run something more like a census than a survey, because you merely turn it on and let it run. This is not a smart way to get statistically accurate insight, and worse, if you run a census, you're wasting an opportunity if you treat it like a survey. If you ask every customer a question, you'd better be prepared to follow up with every customer who's not happy.
Are you looking for correlations? Causation is almost impossible to find in a survey. But if you're smart, you can learn a lot if you're able to determine that people who said "B" in answer to question 3 are also likely to believe "E" in answer to question 6. This is a huge step in your ability to determine worldviews and to ultimately treat different people differently.
It doesn't matter if 40% of your customers believe something about price and 39% believe something about features, but if you discover that 98% of the customers who believe this about price also believe that about quality, you just found something useful.
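A minimal sketch of what finding that kind of correlation can look like, assuming the responses already sit in a table with one column per question (the column names and answers here are hypothetical, not from the post):

```python
import pandas as pd

# Hypothetical responses: one row per respondent, one column per question.
responses = pd.DataFrame({
    "q3_price":   ["B", "B", "A", "B", "C", "B"],
    "q6_quality": ["E", "E", "D", "E", "D", "E"],
})

# Cross-tabulate the two questions as row percentages: of the people who
# answered "B" on price, what share also answered "E" on quality?
crosstab = pd.crosstab(responses["q3_price"], responses["q6_quality"], normalize="index")
print(crosstab)
```

The useful output isn't the overall percentages; it's the row where one belief almost always travels with another.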
Is this worth my customer's time? It's super easy to commission a survey. Pay your money and you're done. But then what? FedEx sent Ipsos after me and thousands of other people by phone, wasted more than ten minutes of my time with a survey that never ended, and then they never followed up. Those ten minutes cost FedEx a huge amount of trust and goodwill.
Asking someone to answer a survey has a very real cost. Is the survey worth it?
Are you asking questions capable of making change happen? After the survey is over, can you say to the bosses, "83% of our customer base agrees with answer A, which means we should change our policy on this issue"?
It feels like it's cheap to add one more question, easy to make the question a bit banal, simple to cover one more issue. But if the answers aren't going to make a difference internally, what is the question for?
Are you push polling? The questions you ask actually end up changing the person who is responding. Ask me if I'm unhappy and I'm a lot more likely to become unhappy. Ask me who my favorite customer service person is and I'm more likely to look for good customer service people.
This is a challenge that most census-structured customer service surveys have to deal with. If you ask someone if they're satisfied and then don't follow up later, you've just made the problem a lot worse. If you ask your best customers for insight and then ignore it, you've not only wasted the insight, you've wasted goodwill as well.
Here's a simple test I do, something that has never once led to action: In the last question of a sloppy, census-style customer service survey, when they ask, "anything else?" I put my name and phone number and ask them to call me. They haven't, never once, not in more than fifty brand experiences.
If you're not going to read the answers and take action, why are you asking?
Best question to ask about a survey: Do we actually have to run this?