
Not all polls or surveys are scientific or newsworthy. Companies will often sponsor polls on their own websites and send them out strictly for marketing purposes. Some organizations carefully select their respondents in order to get the results they want. Don't let pollsters and marketers lead you astray with questionable "facts" and stats. Ask yourself a few questions before you agree to write an article.
Also remember that no matter how credible the poll, reality may be very different. Anyone who has followed election results closely can cite times when the pollsters were completely wrong. Answering a telephone poll is a very different activity from marching down to the polling place and filling out a ballot. When asked about personal habits, such as how healthily they eat, many people will exaggerate how well they are doing, hang up the phone, and go back to their bag of potato chips. A poll can never give you a complete picture.
Who sponsored this poll and why?
Advocacy groups will often sponsor polls in order to make a point. Be careful that you aren't being used to present biased data as fact.
What was the sample size?
Obviously, the fewer the people surveyed, the less accurately the poll will gauge public opinion.
What is the margin of error?
Reliable polling firms will provide a margin of error with their results, which is based primarily on the sample size. If you are reporting on an election, you may find that the difference between two candidates falls within the margin of error, which means the race is extremely close. However, it's also important to note that elections are not always decided by the popular vote, because of the electoral systems in place. The most famous example, of course, is the 2016 U.S. presidential election, in which many pollsters predicted a win for Hillary Clinton, who did, in fact, win the popular vote by nearly 2.9 million votes. Then, of course, there is the undecided voter, who could go either way on election day or not show up at all.
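As a rough illustration (not how any particular polling firm computes it), the textbook relationship between sample size and margin of error can be sketched in a few lines of Python. This assumes an idealized simple random sample at 95% confidence with a worst-case 50/50 split; real firms adjust for weighting and survey design:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Approximate margin of error for a simple random sample of n people.

    Assumes 95% confidence (z = 1.96) and the worst-case proportion
    p = 0.5. Returned as a fraction, e.g. 0.031 = +/- 3.1 points.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of 1,000 respondents:
print(round(margin_of_error(1000) * 100, 1))  # about 3.1 percentage points
```

Notice that quadrupling the sample size only halves the margin of error, which is why most polls settle for roughly 1,000 respondents rather than paying for many more.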
How were the participants selected?
How a poll is conducted can have a major impact on the results. For example, a telephone survey is more likely to reach older people, because they are more likely to have landlines in their homes. An Internet survey can bring in more young people, but the results may not be as accurate. Also, many Internet surveys are incentivized in some way with points, giveaways or payouts, which attracts a certain type of respondent.
Asking people to respond to a poll on a website or social media account can produce biased results, because primarily the people who follow that website or account will respond. Sometimes groups with a particular point of view will hijack a voluntary poll, getting all of their members to respond so that it appears the majority shares their view.
To counter this, pollsters will often set quotas. For example, they will cap the number of female respondents at a percentage that matches the share of women in the actual population.
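The quota idea can be sketched in a few lines: once a group's share of the sample reaches its share of the population, further respondents from that group are turned away. The group names and population shares below are illustrative assumptions, not figures from any real poll:

```python
# Hypothetical quota-sampling sketch. Population shares are assumed
# for illustration only.
population_shares = {"women": 0.51, "men": 0.49}
target_sample = 1000

# Cap each group at its share of the target sample size.
quotas = {group: round(share * target_sample)
          for group, share in population_shares.items()}
counts = {group: 0 for group in quotas}

def accept(group):
    """Accept a respondent only if their group's quota isn't full yet."""
    if counts[group] < quotas[group]:
        counts[group] += 1
        return True
    return False
```

Once `accept` starts returning `False` for a group, the pollster keeps surveying only the under-represented groups until every quota is filled.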
Approaching people in a shopping mall or other public place may not produce accurate results, because people may be reluctant to answer personal questions, or may give the opinion they think the pollster would approve of. Also, pollsters may be more likely to approach people who look similar to themselves, unconsciously selecting people of the same race, age range or socio-economic background.
What were the questions?
Leading questions are a big problem in polling and surveying. Sometimes, when you take a poll, you can't find any answer that matches your actual opinion, because the questions are designed to portray one side of the issue as undesirable and the other as desirable. Facts and figures that may not be entirely accurate can also be added to the poll to sway you toward one side of an issue.
When was the poll conducted?

In the world of elections, public opinion changes constantly, because the candidates are campaigning and in the public eye. One mishap can have disastrous results. New opinion polls are released right up until the vote, or the legal cut-off point, to capture these day-to-day changes. Other types of polling data change less frequently, but the date should still be taken into consideration. Topics like health and fitness, retail spending, job markets and political opinion are subject to forces such as public awareness campaigns, government funding, the economy and advertising.
You do not necessarily have to dissect all of the above factors in your article, but your article should mention whether the poll was scientific and how it was conducted. Non-scientific polls can be interesting in their own right, even if only for a light-hearted story that gets people talking. For example, my city of Vancouver was once voted the third worst-dressed in North America, and every spring automobile associations take votes on the biggest potholes. As long as your reader knows the quality and purpose of the poll, don't worry about demanding scientific data from your source.
On the other hand, if a poll exists solely to promote a certain viewpoint, you might want to consider whether it is worthwhile or just a form of aggressive PR and lobbying. The other factor to keep in mind is newsworthiness. Public opinion exists on every subject. To get covered, a poll needs to be new and noteworthy.
Lastly, if you pose these questions to a person providing you with poll results and they become evasive or refuse to answer, be very wary of their intent and do not publish their results.
If you are interested in learning more about reporting and journalism, check out the ebook, How to Become a Journalist without a Degree.