The findings in public opinion polls are a fixture in the news media, with pollsters measuring the public's thoughts on politicians, public policy and pop culture. But how does polling work? And how accurate is it?
Poll results are a staple of news coverage during an election season. The headlines in newspapers and on Internet news sites scream: “Candidate A surges ahead.” “Candidate B drops in latest poll.” “Majority supports issue C.”
Or think of the news anchors, breathlessly announcing poll results. “Breaking news tonight, a new poll finds…” You know what happens when the results of a new poll are released to the news media. But it turns out it is not easy to get those numbers. It all starts in a large room on the first floor of a classroom building at Elon University. Students are seated in cubicles, with telephones and computer screens in front of them. The screens display the script with all of the poll questions.
The student dials the next number on the list and then waits, hoping someone will answer. If a person answers, the student begins the pitch, asking whether the person wants to participate in the poll. It is not easy, as Jason Husser, Assistant Professor of Political Science and Policy Studies, finds as he looks over the logs of calls that have been made.
“So what we find is that of the 17,000 numbers we’ve called so far in this survey, in 7,000 calls nobody answered, there was just nobody there,” says Husser, as he moves the mouse to look through the list. “In 3,500 cases people refused to participate, 569 numbers were busy, 2,000 lines weren’t working at all and 200 people who answered didn’t speak English. So we have made about 17,000 calls to get the 788 responses we’ve achieved so far for this survey.”
It takes 40 students, calling people on landlines and cell phones, five nights to complete the survey. They will make contact with people in about 10,000 cases in order to get responses from 1,000 people, which is the goal of the survey.
The room buzzes with the sounds of surveys.
“And what do you think is the most important issue in the U.S.?”
“Do you approve or disapprove of the way the General Assembly is doing its job?”
“Do you oppose or support gay marriage?”
In talking with students and professors, you find that polling is frustrating work, requiring students to be trained not only in patience but also in the skill of asking questions.
“I think the Elon Poll is great,” says Matthew Albers, a senior who has worked on several polls and is now a supervisor. “It’s academic, we’re not trying to get you to think one way or another, we word the questions as unbiased as possible.”
Polling is also very labor intensive, which makes it expensive. That’s probably why few polls similar to the Elon Poll are conducted. Live-caller polls, in which an actual human calls landlines and cell phones, tend to be the most accurate, but they also take a lot of time.
“I really like math and I really like political science, and I like how they intersect and for me polling is the obvious sign of that,” adds Maggie Macdonald, who is also a senior and a supervisor. “So for me this made my statistics and political science classes real.”
Opinion polls, such as the Elon Poll, are based in science: mathematics, social science, statistics, and probability. But the result of this measurement of public opinion is not the discovery of an unchanging truth or a science-based fact. An opinion poll produces a science-based glimpse of what people are thinking at a certain moment. It makes sense out of randomly sampled data.

Think of it as a snapshot of the public’s opinion at one moment in time.

The goal is to measure the opinion of a large group of people by sampling the opinion of a much smaller group. To make it all work, the poll must ensure the sample group accurately represents the larger population.
The Elon Poll buys a list of cell phone and landline numbers from a private company. Numbers are drawn at random and called until the goal of 1,000 responses is reached.
Polling science shows that 1,000 responses produces a manageable level of uncertainty. That’s the “plus or minus” margin of error you always read or hear when the results of a poll are released. Interviewing a larger number of people doesn’t change the figure much at all, and it adds to the cost of the poll.
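Why 1,000? For a simple random sample, the 95 percent margin of error shrinks only with the square root of the sample size, so the gains fall off quickly. A minimal sketch using the textbook formula (the Elon Poll’s actual design effects and weighting would adjust these figures somewhat):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    p=0.5 is the worst case (maximum variance); z=1.96 is the
    critical value for a 95% confidence level.
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (500, 1000, 2000, 5000):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
# n=500:  +/- 4.4 points
# n=1000: +/- 3.1 points
# n=2000: +/- 2.2 points
# n=5000: +/- 1.4 points
```

Doubling the sample from 1,000 to 2,000 respondents, and doubling the cost, trims the margin by less than a point, which is why roughly 1,000 responses is such a common target.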
The responses are matched to the results of the U.S. Census.
“The census is our primary weighting point, so we make sure our sample matches North Carolina in terms of age, race and gender,” says Husser. “So you will never see an Elon Poll that has too many women, or men, or African Americans. The percentage we release at the end will perfectly match the census data for the state.”
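The weighting step Husser describes can be illustrated with a toy post-stratification sketch: respondents in under-represented groups count slightly more than one person, over-represented groups slightly less. The group shares and responses below are invented for illustration; they are not actual census or Elon Poll figures:

```python
# Hypothetical example: weight a sample so its gender mix matches a
# census benchmark, then compute a weighted approval share.
sample_share = {"men": 0.40, "women": 0.60}   # who actually answered
census_share = {"men": 0.48, "women": 0.52}   # population benchmark

# Weight = target share / observed share for each group.
weights = {g: census_share[g] / sample_share[g] for g in sample_share}

# Each response is (group, answer); the approval rate is weighted.
responses = [("men", "approve"), ("women", "approve"), ("women", "disapprove")]
approve = sum(weights[g] for g, r in responses if r == "approve")
total = sum(weights[g] for g, _ in responses)
print(f"weighted approval: {approve / total:.0%}")
```

Here men are under-represented in the sample (40 percent of respondents versus 48 percent of the population), so each man’s answer is weighted up and each woman’s weighted down, nudging the headline number toward what a perfectly representative sample would have produced.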
And those numbers include cell phones and landlines. If the students called only landline numbers, a large number of young people would be left out of the poll, because young people tend to have cell phones rather than landlines.
There’s another check that is performed to increase the accuracy of the poll.
The survey questions are pre-tested to make sure the question itself doesn’t affect the poll’s findings. Simply adding or subtracting a word can change the response.

“Even a question as simple as ‘do you own a library card?’ can throw a response because it puts somebody on the spot,” explains Kenneth Fernandez, Assistant Professor of Political Science and Policy Studies and the Director of the Elon Poll. “That’s because the question has them wondering ‘why are they asking me that?’ And then they start thinking that a library card is really important and perhaps they should own a library card. And so to keep themselves from looking bad they will lie.”

It turns out that a certain percentage of people will lie to an interviewer who is conducting a poll so they don’t look bad. So poll questions are pre-tested to ensure they are unbiased and elicit an accurate response.
It takes about one week to compile the findings. The poll is then released and it becomes part of the news cycle.
Tips for Making Sense of Opinion Polls, by Frank Graff
Lee Atwater, the political consultant who helped guide Ronald Reagan to the White House, once famously told a reporter, “Perception is reality.” Take a look at the daily news cycle these days and you will realize how true his comment is.
The news is jam-packed with polls. Whether you are watching a nightly newscast, reading the newspaper or browsing online news articles, it won’t take long to find the percentage of Americans who believe a government policy is good or bad; the breakdown of people who follow a new diet plan; or the percentage of working people between the ages of 45 and 65 who think they are saving enough for retirement.
And this year, the level of polling news will dramatically increase because it’s an election year. In fact, depending on the findings of the poll, the survey itself can become the headline: “The Race is a Dead Heat!” “Candidate X Pulls Ahead!” You get the idea.
The trouble is, with so many polls conducted by such a wide variety of organizations, it is difficult to know just what to believe from all of the information you are hearing. What’s more, polls are used not only to measure public opinion but also to shape it. That’s why it pays to be a smart consumer of polling news.
So, with help from the folks at Elon who survey North Carolina residents, here are a few thoughts to consider when reading about poll results:
1. Who conducted the poll? Make sure the organization has a good track record. Media organizations and universities are one thing; a private polling firm contracted by a party or candidate should be viewed with a little more skepticism.
2. What’s the methodology? In other words, how was the poll conducted? Did the pollster sample only telephone users with landlines? How many people were surveyed? The answers to all of those questions will influence the results.
3. What’s the margin of error? Since pollsters question only a small number of people to reach a statistical sample of the population, the margin of error reflects the pollster’s confidence that the sample reflects the entire population. The more people who are polled, the smaller the margin of error will be.
4. Look carefully at the question that was asked. Is it straightforward, or does it suggest an answer that caters to the organization sponsoring the poll?
5. If the question doesn’t make sense to you, chances are it was confusing to the people who answered the poll.
6. When was the poll conducted? If it was close to a news event, such as a debate, an attack or a disaster, chances are the responses to the poll will be distorted.