You see them in the news quite often: opinion polls on politics, the environment, and more. People place great emphasis on these results. The problem is, we shouldn’t.
Sample size issues
What you may not realize is that the majority of these polls sample a very small number of people. For example, according to the U.S. Census Bureau 2012 population statistics, there are over 42 million people in the United States between the ages of 20 and 29. And yet, if you look at polls published regularly by the news media, their sample size is typically fewer than 300 people within a similar age range.
- On December 15, 2013, USA Today (in cooperation with Pew Research) published the results of a poll: Obama struggles with Millennials. The poll surveyed only 229 millennials.
- The article above cited a December 2013 Wall Street Journal/NBC poll as supporting evidence. That poll surveyed only 100 millennials.
So, fewer than 300 people are supposed to accurately represent the opinions of 42 million individuals.
The examples above do openly state their sample sizes and margins of error, but my point is that we shouldn’t place such huge emphasis on polls with such small samples.
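For context, the margin of error reported alongside these polls typically comes from the standard formula for a sampled proportion. Here is a minimal sketch, assuming simple random sampling, a 95% confidence level (z = 1.96), and the worst-case proportion p = 0.5; the sample sizes are the ones cited above.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a polled proportion,
    assuming a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=229: +/- {margin_of_error(229):.1%}")  # roughly +/- 6.5 points
print(f"n=100: +/- {margin_of_error(100):.1%}")  # roughly +/- 9.8 points
```

Keep in mind that these figures hold only if the sample really is random; the collection-method biases discussed next violate exactly that assumption.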
The other thing that always makes me very nervous about opinion polls is their collection methods. No collection method is perfect; all of them have flaws:
- Phone polls: Pollsters typically call only home landlines. There is a huge population of cell-phone-only households that is left out.
- Story-linked web polls: If someone clicks on a story and then takes a poll related to the story, they would be considered to have “high interest” in the story, which means the poll leaves out others who are “low interest.”
- Web polls: You have to be on the web to take them. I know that’s considered very common, but there are still populations within the U.S. who are not regularly online.
- Intercept polls (such as stopping people at a mall): These polls typically end up targeting a segment of the population with an interest in similar activities (otherwise they wouldn’t be in the same place). Classic examples of this gone wrong: asking only rock-concert attendees how they feel about rock music, or asking people who are already out shopping about consumer confidence.
A third key factor is that people typically won’t take the time to respond to a poll if they aren’t interested in the topic. This immediately skews the results toward those in the “high interest” or “strong opinion” categories.
Think about your own habits. What if you were in the middle of something and you got a call asking you to take a poll on a subject you couldn’t care less about? Would you take it? Probably not. But if it’s something you are very passionate about, you probably would.
In sum, we really need to stop promoting and emphasizing these polls. They can do more harm than good by creating assumptions that shouldn’t be drawn from such small amounts of information.
For more on this, visit: