A Bridge primer on how to gauge the credibility of political polls

Breaking News! A new poll shows Candidate A crushing Candidate B just days before the election, with what looks like an insurmountable lead.

“Why bother voting next Tuesday?” you murmur. “It’s already decided.”

Actually, the question you should be asking is: “Can I really trust this poll?”

Come election time, hundreds of companies nationwide are contacting potential voters, trying to divine the outcome before any ballots are counted. And almost daily, some website or media outlet reports – or simply tweets, in 140 characters or fewer – the “news” of who’s up and who’s down. Context is often a victim of brevity.

Before accepting the latest poll as gospel, know this: “The numbers are not all created equal,” said Arthur Lupia, a professor of political science at the University of Michigan and a research professor at the school’s Institute for Social Research.

Consider a few scenarios. What if the poll you’re following before the Nov. 4 midterm elections:

  • Only called people with landlines
  • Only sought out as “likely voters” those residents who had previously voted in a presidential election
  • Contacted potential voters through an online survey

Each of these decisions could seriously degrade the reliability of that poll. How?

A polling firm that didn’t call mobile phones would be missing a wide swath of voters, especially, according to research, younger and minority voters.

A firm that interviewed people based on whether they turned out for the last presidential election would also risk error – as midterm voters can be a very different group from voters in presidential elections, with experts expecting a more concentrated Republican turnout this year.

Online or computer-directed polls can be more prone to error than polls conducted by live interviewers.

“The average citizen doesn’t have the background and the training to judge the quality of polls,” said Michael Traugott, a professor and senior research scientist in the Center for Political Studies at the Institute for Social Research at UM.

To fill in that gap, he co-authored “The Voter’s Guide to Election Polls” with Paul Lavrakas, laying out how anyone can critically assess polls. It’s a pretty tall order.

Traugott’s book is aimed primarily at journalists – in most instances, the gatekeepers of what polls the public hears about. It provides a dozen questions to ask before deciding whether a poll’s results should be published at all, including questions about methodology that voters rarely get to vet. But that doesn’t mean the voting public can’t benefit from raising a few questions of its own.

Here are some factors voters should consider as Election Day nears.


Size matters – to a point

For most people, judging the quality of a poll might begin and end with knowing the margin of error. But that’s only one piece of what can make or break a poll. “You shouldn’t use the margin of error as an indicator of quality,” Traugott said.

Polls with 600 or more respondents can usually produce margins of error within a few points – low enough to instill confidence. But even a survey of 2,000 respondents can be flawed if questions are poorly worded or asked in a leading manner, or if the polling firm surveys people who are not representative of those who will actually vote.
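
To see why roughly 600 respondents is enough to get within a few points, here is a minimal sketch using the textbook margin-of-error formula for a simple random sample – the generic statistical calculation, not any particular firm’s method:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a simple random sample.

    n: number of respondents; p: observed proportion (0.5 is the worst case);
    z: z-score for the confidence level (1.96 for 95%).
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (600, 1000, 2000):
    print(f"n={n}: +/- {100 * margin_of_error(n):.1f} points")
# n=600: +/- 4.0 points
# n=1000: +/- 3.1 points
# n=2000: +/- 2.2 points
```

Note the diminishing returns: more than tripling the sample shaves less than two points off the margin. And as Traugott warns, none of this guards against bad question wording or an unrepresentative sample.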

Who will vote

In addition to making sure the sample of people surveyed matches the electorate geographically and demographically, pollsters must get the right electorate in the first place. To do so, they try to identify “likely voters.” That can prove tricky, especially when trying to gauge young voters.

In 2008, many polls had Barack Obama ahead, but almost none predicted the size of his victory – in part because no one anticipated so many new voters, and in part because the energized youth vote was screened out of many polls, since those voters had never cast a ballot before.
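
Here is a toy illustration of that screening effect. Every respondent and support figure below is invented; the point is only to show how a screen keyed to past presidential turnout silently drops first-time voters:

```python
# Hypothetical respondents: (voted_in_last_presidential, supports_candidate_A)
respondents = [
    (False, True),  # energized first-time voters backing A...
    (False, True),
    (False, True),
    (True, True),
    (True, False),
    (True, False),
    (True, False),
]

def share_for_a(pool):
    """Fraction of a respondent pool supporting candidate A."""
    return sum(supports for _, supports in pool) / len(pool)

# A likely-voter screen based solely on past presidential turnout
# excludes every first-time voter, however motivated.
screened = [r for r in respondents if r[0]]

print(f"all respondents:  {share_for_a(respondents):.0%} for A")  # 57% for A
print(f"after the screen: {share_for_a(screened):.0%} for A")     # 25% for A
```

In 2008, the real-world version of this gap was the youth surge the screens missed.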

A statewide poll of Michigan in which 7 percent of respondents are Detroiters, for instance, looks pretty reliable – if it’s conducted during a presidential election. That’s very close to the 6.6 percent of the statewide vote that Detroiters cast in 2008, when Barack Obama first ran.

But in the 2010 midterm elections, when Gov. Rick Snyder swept into office, Detroiters comprised only 5.4 percent of the statewide vote – 157,440 fewer Detroiters cast ballots than in the 2008 presidential election, and 113,000 fewer than would vote in the 2012 presidential race. So gauging the anticipated Detroit vote based on turnout during a presidential year is probably going to be less reliable than looking to past midterm numbers.
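
A back-of-the-envelope sketch shows how much the turnout model alone can move a statewide number. Only the Detroit vote shares below come from the figures above; the candidate-support levels are invented for illustration:

```python
# Detroit's share of the statewide vote (from the turnout figures above)
DETROIT_SHARE_2008_PRESIDENTIAL = 0.066
DETROIT_SHARE_2010_MIDTERM = 0.054

# Hypothetical support for one candidate, city vs. rest of state (illustration only)
SUPPORT_DETROIT = 0.90
SUPPORT_REST_OF_STATE = 0.45

def statewide_support(detroit_share):
    """Blend city and outstate support using the assumed Detroit vote share."""
    return detroit_share * SUPPORT_DETROIT + (1 - detroit_share) * SUPPORT_REST_OF_STATE

print(f"presidential-year turnout model: {statewide_support(DETROIT_SHARE_2008_PRESIDENTIAL):.1%}")
print(f"midterm turnout model:           {statewide_support(DETROIT_SHARE_2010_MIDTERM):.1%}")
# presidential-year turnout model: 48.0%
# midterm turnout model:           47.4%
```

Roughly half a point of difference comes from the turnout assumption alone, before any other source of error – enough to matter in a close race.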

Red flags

Few voters are going to dive into the methodologies followed by polling firms, and sometimes the specifics are not fully disclosed. As a result, too often voters are left to evaluate a few clues given in news accounts. What should voters be asking? Consider these questions:

Did the polling firm use live interviewers to ask the questions?

Or was the poll based on so-called “robo calls” conducted by computers? Such polls are cheaper to perform but are considered by many to be less reliable. Because federal law bars autodialed calls to cell phones, Traugott said, automated polls miss part of the electorate and can produce less reliable results. A third of all adults rely entirely on cell phones – a share that is higher among young people and minorities – making such polls more likely to undercount Democrats.
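
A minimal sketch of that coverage problem, assuming made-up support levels for each group (only the one-third cell-only figure comes from the paragraph above):

```python
CELL_ONLY_SHARE = 1 / 3  # roughly a third of adults, per the figure above

# Hypothetical support for the Democratic candidate in each group (illustration only)
SUPPORT_CELL_ONLY = 0.60
SUPPORT_LANDLINE = 0.48

# What the full electorate actually thinks...
true_support = (CELL_ONLY_SHARE * SUPPORT_CELL_ONLY
                + (1 - CELL_ONLY_SHARE) * SUPPORT_LANDLINE)

# ...versus what a landline-only automated poll can measure.
robo_poll_estimate = SUPPORT_LANDLINE

print(f"true support:        {true_support:.1%}")        # 52.0%
print(f"robo-poll estimate:  {robo_poll_estimate:.1%}")  # 48.0%
print(f"coverage bias:       {robo_poll_estimate - true_support:+.1%}")  # -4.0%
```

No sample size fixes this: the cell-only third is never dialed, so adding more landline interviews only reproduces the same skew with a smaller margin of error.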

Is a link provided to the full range of questions asked of respondents, and the order of those questions? Reputable polling firms are transparent in how they conduct their polls. Indeed, many subscribe to an industry code on transparency, such as that of the American Association for Public Opinion Research.

Were the questions asked in a neutral way, or did they appear to lead the respondents to a particular answer?

Was the poll conducted by an independent polling firm, or by companies with a stake in the outcome, such as political campaigns, or an industry or advocacy group?

The major newspapers and television stations generally hire well-known, independent-minded pollsters to conduct polls. But lesser-known outfits – some hired by candidates – are also polling. And often, in the days leading up to an election, a losing candidate’s campaign will start whispering to journalists about its own internal polls. The race is tightening, they’ll say, without divulging data. Traugott said those polls are more advertising than a reflection of voter sentiment.

“I think the most common problems are polls which are released by the campaigns,” Traugott said.

A word more about leading questions. Consider a recent example of how both the order and the nature of the questions can affect responses. In August, a number of media outlets in the United Kingdom touted a “new poll” showing that more than half of those surveyed (57 percent) supported “fracking” to get at shale gas reserves.

What the stories didn’t say was that respondents were asked the question after first being asked to agree with leading statements, including: “The UK needs to invest more in a whole range of new infrastructure, including housing, roads and railways, airport capacity and new energy sources.”

They were then asked about fracking, a process the poll defined as working with “tiny fractures in the rock deep underground” using “non-hazardous materials.” Then came this statement:

“The British Geological Survey has estimated that the UK has 1,300 trillion cubic feet of natural gas from shale. If just 10% of this could be recovered, it would be enough to meet the UK's demand for natural gas for nearly 50 years or to heat the UK's homes for over 100 years.”

Only then did the pollsters, hired by the oil and gas industry, ask: “From what you know, do you think the UK should produce natural gas from shale?”

The lesson: beware of how questions are asked – and that caution holds for polling performed on both sides of an issue or political race.

As Nov. 4 approaches, expect more polls and more poll reporting. Is Snyder increasing his lead? Does Terri Lynn Land have a chance? Before you decide to believe what you read, it’s important for the media and voters alike to ask a few questions.


Comments

Tue, 10/28/2014 - 7:22am
May I suggest a brief summary of most polls? Garbage in, Gospel out.
Nick Ciaramitaro
Tue, 10/28/2014 - 10:01am
At this point the only poll worth considering is the one on November 4th. The size of the sample -- everyone -- is reliable and if you don't participate you have only yourself to blame.
Tue, 10/28/2014 - 10:13am
We do large sample polling and have found a way to weight automated (robo) polls so that they provide our political and media clients with accurate, inexpensive polling. We only poll likely voters in the election coming up. We work diligently to weight the data to reflect voters 18-29 and 30-39 -- those least likely to have land lines.
Tue, 10/28/2014 - 10:33am
Sadly, push-polling has become the norm. It would appear the ability to construct a fair, objective question has lost its utility. Taking a poll is nothing more than a partisan interactive robo-call. And when a respected pollster like Bernie Porn releases results, the response is to attack the messenger.
Tue, 10/28/2014 - 10:37am
This story is a great idea and a public service. It reminded me we should probably re-distribute an article on understanding polls from MichiganScience magazine, formerly published by the Mackinac Center. http://www.mackinac.org/9262
Glenn
Tue, 10/28/2014 - 12:45pm
Yeah. That's a reliable, unbiased source.
Willtyler
Tue, 10/28/2014 - 10:50am
A timely article and well presented. “Can I really trust polls?” is the important question, and may have been a better headline. The effect of polling data on how people choose to vote should not be underestimated, as it can often be a deciding factor in elections. “Why bother voting next Tuesday? ... It’s already decided” is certainly a common mindset of the electorate. The coercive influence of manipulated polling data is something the media have been lax in covering. One point not mentioned is the polling questions NOT asked. For example, when polling political races the question asked most often is “Do you support Republican candidate X, Democrat candidate Y, ‘other,’ or ‘undecided’?” Rarely are candidates from minor parties mentioned, which leads to the exclusion of minor-party and independent candidates from debates because they did not reach an arbitrary poll percentage. The Catch-22 is that if they are not included in the poll, they can never reach the polling numbers “required” to be included in debates. The public is being hoodwinked by this system, which should be exposed by more articles such as this one. It would be more equitable, and in the best public interest, to invite all candidates who are on the ballot to any public debate, regardless of poll numbers.
Mike Wilkinson
Tue, 10/28/2014 - 11:25am
We were aware of these rankings but did not include them because the firms could have made changes in methodology that are not reflected in past performances. It will be interesting to see if any polling group makes changes in the wake of this year's election because the automated polls require more 'weighting' to get at younger voters who typically are cell-phone only and cannot be contacted via automated polling.
Kara Douma
Tue, 10/28/2014 - 11:49am
Good primer. These issues are being hotly debated within the survey research profession at this time. But AAPOR, the premiere academic and professional organization for survey research practitioners, has addressed many of these issues with regard to:
  • Only called people with landlines
  • Only sought out as “likely voters” those residents who had previously voted in a presidential election
  • Contacted potential voters through an online survey
While it is believed polling will eventually move entirely online, we are not there yet. And while weighting can be used for small adjustments to the sample so it more accurately reflects the population in question, it CANNOT be used to correct for non-coverage bias.
Mike Wilkinson
Tue, 10/28/2014 - 12:01pm
Exactly. How do we know that the young people who did answer a landline, and are used to represent all young people, have the same characteristics as cell-phone-only youth? That's something for the social scientists to determine. Maybe after the election.
Kara Douma
Tue, 10/28/2014 - 6:51pm
EXACTLY.
Barry
Tue, 10/28/2014 - 4:09pm
"Do you think the country (or State) is going in the right direction?" is my favorite push poll question because it is automatically assumed to reflect on the President (or Governor). I answer this question with a resounding "no" but am aiming my displeasure at a recalcitrant and obstructive Congress (or Legislature), particularly the Republican "do nothing that makes a difference or might benefit the citizenry" Party.
Kara Douma
Tue, 10/28/2014 - 6:51pm
538's rankings were an academic exercise in modeling. Some firms were "graded" on only a handful of surveys released as many as 15 years ago. It was an interesting exercise, but misinterpreted by some in the media and others.
JGunn
Sun, 11/02/2014 - 10:46am
Here's another related problem: aspiring politicians who hire a polling company to "test the waters" before they decide to enter a race. An unscrupulous pollster will ask leading questions, come up with data that looks favorable, and then the politician decides to go ahead and hires the pollster for the whole campaign. The pollster gets a reliable and lucrative stream of revenue for the duration of the campaign, and the naive would-be politician wastes his or her money and time. I saw that happen in Maryland – an ambitious neophyte tried to run against a popular incumbent in a primary. Pretty sad, really.
JGunn
Sun, 11/02/2014 - 10:54am
For economists - hiring a polling company suffers from a classic principle-agent problem: they can cut corners and and deliver plausible-looking data that is actually misleading (i.e., far from the truth). If you really want honest numbers, you need to find a polling firm with a solid reputation. If you're a consumer, you need to stick to the old rule: caveat emptor (let the buyer, or in this case the reader, beware). Here's an analogy: remember the earthquakes in S. W. China a few years ago? Many large buildings collapsed while others, built at the same time, withstood the quake. Unfortunately, many publicly-owned buildings (built by building contractors who paid bribes to get their contracts) including schools, were the ones that collapsed, killing many students.