
A Bridge primer on how to gauge the credibility of political polls

Breaking News! A new poll shows Candidate A crushing Candidate B just days before the election, with what looks like an insurmountable lead.

“Why bother voting next Tuesday?” you murmur. “It’s already decided.”

Actually, the question you should be asking is: “Can I really trust this poll?”

Come election time, hundreds of companies nationwide are contacting potential voters, trying to divine the outcome before any ballots are counted. And almost daily, some website or media outlet reports, analyzes or simply tweets – in 140 characters or fewer – the “news” of who’s up and who’s down. Context is often a victim of brevity.

Before accepting the latest poll as gospel, know this: “The numbers are not all created equal,” said Arthur Lupia, a professor of political science at the University of Michigan and a research professor at the school’s Institute for Social Research.

Consider what would happen if the poll you’re following before the Nov. 4 midterm elections:

  • Only called people with landlines
  • Only sought out as “likely voters” those residents who had previously voted in a presidential election
  • Contacted potential voters through an online survey

Each of these decisions could seriously degrade the reliability of that poll. How?

A polling firm that didn’t call mobile phones would miss a wide swath of voters – especially younger and minority voters, according to research.

A firm that interviewed people based on whether they turned out for the last presidential election would also risk error – as midterm voters can be a very different group from voters in presidential elections, with experts expecting a more concentrated Republican turnout this year.

Online or computer-directed polls can be more prone to error.

“The average citizen doesn’t have the background and the training to judge the quality of polls,” said Michael Traugott, a professor and senior research scientist in the Center for Political Studies at the Institute for Social Research at UM.

To fill in that gap, he co-authored “The Voter’s Guide to Election Polls” with Paul Lavrakas, laying out how anyone can critically assess polls. It’s a pretty tall order.

Traugott’s book is aimed primarily at journalists – the gatekeepers, in most instances, of what polls the public hears about. He provides a dozen questions to ask before deciding whether a poll’s results should be published at all, including questions about a poll’s methodology – information voters don’t always get to vet. But that doesn’t mean the voting public can’t benefit from raising a few questions of its own.

Here are some factors voters should consider as Election Day nears.


Size matters – to a point

For most people, judging the quality of a poll might begin and end with knowing the margin of error. But that’s only a subset of what can make or break a poll. “You shouldn’t use the margin of error as an indicator of quality,” Traugott said.

Polls with 600 or more respondents can usually produce margins of error within a few percentage points – low enough to instill confidence. But even a survey of 2,000 respondents can be flawed if questions are poorly worded or asked in a leading manner, or if the polling firm surveys people who are not representative of those who will actually vote.
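
For context, the margin of error that accompanies most polls is driven mainly by sample size. Below is a minimal sketch of the standard calculation, assuming a simple random sample and the conservative 50/50 split; real polls also involve weighting and design effects that this ignores.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95 percent margin of error for a simple random sample of n people.

    Assumes the conservative 50/50 split (p = 0.5) and ignores the weighting
    and design effects real polls involve.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(600) * 100, 1))   # ~4.0 points for 600 respondents
print(round(margin_of_error(2000) * 100, 1))  # ~2.2 points for 2,000 respondents
# A bigger sample shrinks this number, but it cannot fix leading questions
# or an unrepresentative pool of respondents.
```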

Who will vote

In addition to making sure the people surveyed match the electorate geographically and demographically, pollsters must get the right electorate in the first place. To do so, they try to identify “likely voters.” That can prove tricky, especially with young voters.

In 2008, many polls had Barack Obama ahead, but almost none predicted the size of his victory – in part because no one anticipated so many new voters, and in part because energized young voters, who had never voted before, were screened out of many polls.

A statewide poll of Michigan in which 7 percent of respondents are Detroiters, for instance, looks pretty reliable – if it’s conducted during a presidential election. That’s very close to Detroit’s 6.6 percent share of the statewide vote in 2008, when President Obama first ran.

But in the 2010 midterm elections, when Gov. Rick Snyder swept to office, Detroiters comprised only 5.4 percent of the statewide vote – 157,440 fewer Detroiters cast ballots than in the 2008 presidential election, and 113,000 fewer than would vote in the 2012 presidential race. So gauging the anticipated Detroit vote based on turnout in a presidential year is probably going to be less reliable than looking to the numbers from a past midterm.
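
To see why that share matters, here is a minimal sketch with made-up support numbers – the Detroit and outstate figures below are hypothetical, used only to show how the turnout assumption alone can move a statewide estimate.

```python
def statewide_support(detroit_share, detroit_support, outstate_support):
    """Blend Detroit and outstate support, weighted by Detroit's share of the vote."""
    return detroit_share * detroit_support + (1 - detroit_share) * outstate_support

# Hypothetical numbers: suppose a candidate draws 95 percent in Detroit
# and 45 percent everywhere else (made-up figures, for illustration only).
presidential_mix = statewide_support(0.066, 0.95, 0.45)  # about 48.3 percent
midterm_mix = statewide_support(0.054, 0.95, 0.45)       # about 47.7 percent

# Roughly a 0.6-point shift comes from the turnout assumption alone,
# before any sampling error is considered.
print(round((presidential_mix - midterm_mix) * 100, 1))
```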

Red flags

Few voters are going to dive into the methodologies followed by polling firms, and sometimes the specifics are not fully disclosed. As a result, too often voters are left to evaluate a few clues given in news accounts. What should voters be asking? Consider these questions:

Did the polling firm use live interviewers to ask the questions?

Or was the poll based on so-called “robo calls” conducted by computers? Such polls are cheaper to perform, but many consider them less reliable. Because it is illegal to call a cell phone with a computer, Traugott said, automated polls miss the third of all adults who rely entirely on cell phones – a share that is even higher among young people and minorities – making such polls more likely to undercount Democrats.

Is a link provided to the full range of questions asked of respondents, and the order of those questions? Reputable polling firms are transparent in how they conduct their polls. Indeed, many subscribe to an industry code on transparency, such as that of the American Association for Public Opinion Research.

Were the questions asked in a neutral way, or did they appear to lead the respondents to a particular answer?

Was the poll conducted by an independent polling firm, or by companies with a stake in the outcome, such as political campaigns, or an industry or advocacy group?

The major newspapers and television stations generally hire well-known, independent-minded pollsters to conduct polls. But lesser-known outfits – some hired by candidates – are also polling. And often, in the days leading up to an election, a losing candidate’s campaign will start whispering to journalists about its own internal polls. The race is tightening, they’ll say, without divulging data. Traugott said those polls are more advertising than a reflection of voter sentiment.

“I think the most common problems are polls which are released by the campaigns,” Traugott said.

A word more about leading questions. Consider a recent example of how both the order and nature of the questions can affect responses. In August, a number of media outlets in the United Kingdom touted a “new poll” showing that more than half of respondents (57 percent) supported “fracking” to get at shale gas reserves.

What the stories didn’t say was that respondents were asked the question after first being asked to agree with leading statements, including: “The UK needs to invest more in a whole range of new infrastructure, including housing, roads and railways, airport capacity and new energy sources.”

They were then asked about fracking, a process the poll defined as working with “tiny fractures in the rock deep underground” using “non-hazardous materials.” Then came this statement:

“The British Geological Survey has estimated that the UK has 1,300 trillion cubic feet of natural gas from shale. If just 10% of this could be recovered, it would be enough to meet the UK's demand for natural gas for nearly 50 years or to heat the UK's homes for over 100 years.”

Only then did the pollsters, hired by the oil and gas industry, ask: “From what you know, do you think the UK should produce natural gas from shale?”

The lesson: beware of how questions are asked – and that caution holds for polling performed on both sides of an issue or political race.

As Nov. 4 approaches, expect more polls and more poll reporting. Is Snyder increasing his lead? Does Terri Lynn Land have a chance? Before you decide to believe what you read, it’s important for the media and voters alike to ask a few questions.
