Here's a (hopefully) quick post about collecting data via surveys.
One of the easiest things to mess up when preparing a survey is to ignore the science of selecting the audience. You can't avoid beginning a survey with a certain bias, and most times that is a good thing. For example, a knitting magazine sends a survey to its readers to better understand why people subscribe. That audience is, by design, limited to people familiar with the product, and the questions can build on that original assumption. When using the collected data to draw conclusions, the audience needs to be stated first and foremost. In this case:
Good: "60% of our readers buy yarn more than once a month."
Incomplete: "60% of people surveyed buy yarn more than once a month."
Taken out of context, the second statement is misleading.
Now, setting the obvious aside, consider the slightly more subtle. I recently took a survey posted by a paper industry newsletter (I know, nerdy). Their audience (and responses) will be skewed to some degree by the way the invitation is delivered. Mostly, that's perfect, because it arrives at the end of the very newsletter for which they are collecting the data. However, if a third-party paper mill wanted to use this industry newsletter to invite people to a general survey about paper-buying habits, they would be starting with a very narrow audience compared to their overall universe of potential buyers. Not bad data, just skewed from the outset. Further, if a bookseller wanted to use this newsletter to invite people to a survey about their reading habits, the responses could be so skewed as to be unusable. Get it?
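To see how much that kind of skew can matter, here's a minimal simulation sketch. All of the numbers (the subscriber share, the buying rates) are invented for illustration:

```python
import random

random.seed(42)

# Hypothetical market of 100,000 potential paper buyers.
# Invented assumption: only 5% subscribe to the industry newsletter,
# and subscribers are far heavier buyers than everyone else.
population = []
for _ in range(100_000):
    subscriber = random.random() < 0.05
    # Made-up rates: 70% of subscribers buy paper monthly vs. 20% of non-subscribers.
    buys_monthly = random.random() < (0.70 if subscriber else 0.20)
    population.append((subscriber, buys_monthly))

# The "true" rate across the whole market
true_rate = sum(buys for _, buys in population) / len(population)

# The rate you'd measure by inviting only newsletter readers
reader_answers = [buys for subs, buys in population if subs]
survey_rate = sum(reader_answers) / len(reader_answers)

print(f"Whole market:      {true_rate:.0%} buy monthly")    # roughly 22%
print(f"Newsletter survey: {survey_rate:.0%} buy monthly")  # roughly 70%
```

Same data-collection mechanics, wildly different headline number, purely because of who got the invite.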
Here's what triggered this post. At the end of the survey, SurveyMonkey asked if I wanted to take more surveys for a chance to win $100 and to help a charity of my choice. If I had the time and inclination, I could sign up to be part of a very skewed and potentially misleading type of audience.
Professional survey takers are likely to be at least one of the following:
- ignorant of the industry they claim to know. If I want to take a lot of surveys, I'd better claim a lot of interests.
- not really interested in the questions, just "burning" through the survey to get to the $100 entry
- the type of person (Opportunity Seeker?) who is willing to spend their time on the chance they will win something. Without casting aspersions, you can imagine situations where this type of person runs contrary to the intended audience.
- other influences?
And speaking of incentives ...
It's hard to get people to take your survey. Adding an incentive is a good way to increase the response rate, and people will often spend more time with the survey if they feel they are being compensated for their time. It's a kind of give and take. The incentive is probably better if it's smallish: just enough to say you value the taker's time, but not so much that it attracts people outside your target audience. The best incentive is something in line with your product or survey, to further qualify your responders. For example, if Pampers offered a chance to win a trip to Maui, I might take their survey even though I don't have kids. If Starbucks offered a free coffee as an incentive, only coffee drinkers would be incentivized.
All this should seem like common sense when you think about it, but given the number of surveys I see come across my computer screen, it feels like it needs repeating. When designing a survey, really study the whole process from start to finish. What do we want to know? Who do we want to target? Who is likely to respond? How do we pose the questions in the most unbiased way? And when finished, include the demographics and inferred habits of the audience and responders to get the most out of the data collected.
To quote David Byrne:
Facts are simple and facts are straight
Facts are lazy and facts are late
Facts all come with points of view
Facts don't do what I want them to
Facts just twist the truth around
Facts are living turned inside out
- from "Crosseyed and Painless" by Talking Heads