How to read survey data

As the (American) midterm elections draw closer, we’re about to be inundated with surveys and polls. But even between elections, surveys are everywhere, for better or worse.

To help filter the signal from the noise, here is the list of tips I’ve collected over the years for critically reading reports based on survey data.

If you’re a reader of survey data, use these tips to help you interpret the surveys you see in the future.

If you’re publishing survey data, be sure to consider these as well, especially if your readers have read this post.

To critically read survey data, you need to know:

  1. Who was surveyed and how
  2. What they were asked
  3. How the results are stated

Let’s look at each of these a bit more…

1. Who was surveyed?

Start with the survey respondents and the population they are intended to represent (if any).

  • Who were they?
  • How were they selected or invited to participate?
  • Who are they supposed to represent, and do they actually represent the group being reported about?

If you can’t answer these questions, then walk away. Such a “survey” is just an opinion piece written with charts and percentages. If this information isn’t reported, you, as the reader, have no way of knowing whether the data means what it is claimed to mean.

Now that you know what to look for, watch for this information. Reputable surveys will report it (and it only takes a sentence). When it is not reported, your first question should be, “Why not?”

A representative survey selects participants at random from the group the survey is reporting on. This is hard and expensive, but it results in good data, and good data takes work.
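
To make the idea concrete, here is a minimal sketch of what “selected at random” means in practice. It is illustrative only: the sampling frame, names, and sizes are made up, and real surveys also have to contend with non-response, weighting, and an imperfect frame.

```python
import random

# Hypothetical sampling frame: a list of everyone in the group the
# survey claims to describe (e.g., all registered customers).
sampling_frame = [f"person_{i}" for i in range(10_000)]

# Simple random sample: every member of the frame has an equal chance
# of being invited, which is what lets the results generalize.
random.seed(42)  # seeded only to make the illustration reproducible
invited = random.sample(sampling_frame, k=500)

print(invited[:5])
```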

A more common and much less expensive approach is a convenience sample, which, as the name implies, is convenient. Such a sample might be drawn from people who read a blog, bought a product, went to a conference, are friends on Facebook, and so on. In most cases, these groups do not represent any group beyond those who responded.

That being said, a non-representative survey can still be meaningful, but you can’t know how it relates to a larger population (nor can the reporter attribute the results to a larger population). The results of a non-representative survey represent only those who responded.

Read that last sentence again. It is important, and I’ll be coming back to it.

2. What were they actually asked?

This is another thing you’ll see reported in credible surveys: the actual question people were asked. This reduces the risk of misinterpretation, where people were asked one question but the results are reported as something different. Sometimes you have to ask a question in a way that differs from how you want to report it, for example, when the population being surveyed uses different terms or language, but it’s still helpful to show the question as it was asked.

3. How do they state the results?

This is where the rubber meets the road, or, in some cases, the guardrail and the ditch.

Someone who reports a representative survey will know it (and be proud of it, because it was really hard), and they’ll report the margin of error: that is, how well the sample they surveyed represents the population they are interested in.
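
If you’re curious where that number comes from, here is a minimal sketch of the usual normal-approximation formula for the margin of error of a reported percentage. It assumes a simple random sample and roughly 95% confidence; the sample size and percentage below are illustrative, not taken from any real survey.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion p, estimated from a
    simple random sample of size n, at ~95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative numbers: 1,000 respondents with 50% answering "yes"
# gives a margin of error of roughly +/- 3.1 percentage points.
print(round(margin_of_error(0.5, 1000) * 100, 1))  # -> 3.1
```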

For a convenience (or unreported) sampling method, the report can only refer to the respondents. It’s very hard to know (as a researcher, and especially as a reader) how well, or whether, a convenience sample represents any group beyond the respondents. So, if it’s being open and honest, the report will say something along the lines of, “out of the N respondents, X% reported…” as opposed to, “of all the [larger population], X% like…”

Convenience-sampled survey data can still be useful and can identify areas for further study, but it cannot honestly be said to represent a larger group.

Be critical and be informed

Now you can read survey data and interpret it with confidence: you have a checklist for what to look for and what to look out for.

This list is just a summary. If you want to take reading survey data to the next level, try “How to assess a survey report: a guide for readers and peer reviewers.”
