Opinion | Polls are sometimes wrong. We know. We study polls

Simon Schuster is a policy fellow at the Institute for Public Policy and Social Research at Michigan State University.

The day after the 2016 election, with polls having widely predicted that Hillary Clinton would win the presidency, many pundits questioned how so many polls had gotten the result so wrong.

Paul J. Lavrakas, the senior research fellow at Michigan State University’s Office for Survey Research, has decades of experience in poll methodology and is co-author of the regularly updated book The Voter’s Guide to Election Polls. As the American political landscape becomes increasingly complex, he said, pollsters face more challenges.

“You’d think over time that polls are getting more accurate, but unfortunately due to the complexities of our elections in 2016, for a variety of reasons, the opposite is happening,” Lavrakas said. “They’re getting less accurate, we have less confidence, even among the scientific polls.”

Still, that doesn’t mean polls are no longer trustworthy, as Dan Thaler, OSR’s research analyst, explained.

“Even with all the complexities and pitfalls, reputable scientific polls are still able to project the winner of an election — or at least the popular vote winner — accurately the vast majority of the time,” Thaler said. “The key is knowing how to differentiate high quality polls from low quality polls.”

It’s important to realize that polls themselves can be tools of persuasion, especially in close races. Research indicates polls can influence the amount of donations a candidate receives and even whether voters show up on election day.

The Office for Survey Research both conducts and collects comprehensive survey data and researches the complex methods behind surveys and polling, in order to better understand how different approaches affect the accuracy of results. Its own surveys are usually carried out over several weeks, unlike most quick election polls.

With the midterms less than a week away, following a few guidelines can make it much easier to know which polls matter.

The source is crucial

A reputable source is key in assessing a poll’s trustworthiness. Ensure the poll has been conducted by a legitimate news outlet or a nonpartisan polling firm, rather than by a political campaign or a partisan group.

Some groups conduct “push polls,” in which voters are contacted under the guise of a poll but are instead fed biased information intended to influence their vote.

Look to the details

The number of voters sampled, whether the survey targets likely voters rather than all registered voters, the size of the margin of error and the options provided to respondents all play a role in a poll’s accuracy. Whether the poll was conducted by phone, mail or online may also matter.

Though a poll’s full methodology is rarely provided, the margin of error (in actuality, the margin of sampling error) is regularly available. For instance, say Candidate A is polled at 54 percent, while Candidate B is polled at 46 percent, and the margin of sampling error is ±5 percentage points. Although Candidate A appears to be ahead, statistically the two are tied, something media outlets don’t consistently report.

That means Candidate A’s real level of support is likely between 49 and 59 percent, and Candidate B’s is likely between 41 and 51 percent. But this only gauges the error from sampling voters. Other kinds of errors can also skew results.
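The arithmetic above can be sketched in a few lines of Python. This is a simplified illustration, assuming the reported ±5-point margin applies to each candidate’s share:

```python
# Illustrative arithmetic for the example above: a reported result
# of 54% vs. 46% with a margin of sampling error of +/- 5 points.
def support_interval(share_pct, margin_pct):
    """Range a candidate's true support likely falls within."""
    return (share_pct - margin_pct, share_pct + margin_pct)

a_low, a_high = support_interval(54, 5)  # Candidate A: (49, 59)
b_low, b_high = support_interval(46, 5)  # Candidate B: (41, 51)

# The two ranges overlap (49-59 vs. 41-51), so the race is a
# statistical tie despite the 8-point headline gap.
overlap = a_low <= b_high
print(a_low, a_high)  # 49 59
print(b_low, b_high)  # 41 51
print(overlap)        # True
```

Because the ranges overlap, neither candidate can be said to lead with confidence.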

Many more complex factors can influence poll results

The wording of questions, the order in which they’re asked and even the order in which candidates’ names are read all have an impact on results, though their effects are more difficult to gauge. Even the number of times a phone number is called back when no one picks up can change the demographics of a sample.

“We find we reach different kinds of people depending upon the number of calls we make,” Lin Stork, OSR’s director, said. “Minority households don’t usually respond on the first call.”

Whether “undecided” is reported as a response also changes the result. Some pollsters don’t report undecided voters, instead allocating those responses to a candidate based on the respondent’s party identification. Others shift undecided responses to the candidate the voter says they are leaning toward.
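The two allocation methods described above can produce different toplines from the same responses. A minimal sketch, using made-up respondents and made-up field names:

```python
# Hypothetical illustration of two common ways pollsters handle
# "undecided" responses (respondents and field names are invented).
respondents = [
    {"choice": "A"},
    {"choice": "B"},
    {"choice": "undecided", "party": "A", "leaning": "B"},
    {"choice": "undecided", "party": "B", "leaning": "B"},
]

def allocate(respondents, key):
    """Count support, assigning undecideds by `key` (party or leaning)."""
    counts = {"A": 0, "B": 0}
    for r in respondents:
        choice = r["choice"] if r["choice"] != "undecided" else r[key]
        counts[choice] += 1
    return counts

print(allocate(respondents, "party"))    # {'A': 2, 'B': 2}
print(allocate(respondents, "leaning"))  # {'A': 1, 'B': 3}
```

The same four responses yield a tie under one method and a 3-to-1 split under the other, which is why two polls of identical voters can report different numbers.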

Multiple polls provide a better picture

“One thing that I see too often is cherry-picking the polls that are most favorable,” Thaler said. “A poll comes out that you like and even subconsciously that’s the one that your brain’s going to latch on to.”

To help readers avoid confirmation bias, polling aggregators such as RealClearPolitics and FiveThirtyEight list most of the available polls for many major races.

“That’s one thing that these polling aggregators are good for, not overemphasizing one poll that stands out because it looks good to me,” Thaler said.
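The idea behind an aggregator’s topline can be sketched with a simple average. The numbers below are invented for illustration; real aggregators also weight each poll by recency, sample size and the pollster’s track record:

```python
# Hypothetical results for one candidate across five recent polls.
polls = [52, 48, 50, 47, 51]

# A bare-bones "polling average": no single outlier dominates,
# which is the point of looking at many polls instead of one.
average = sum(polls) / len(polls)
print(round(average, 1))  # 49.6
```

The 52-point poll that a supporter might latch onto contributes only a fifth of the average.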

Understand polls are ultimately an estimate

Polls question a sample of voters in an effort to understand how the whole population will vote. There is always uncertainty in those results, along with room for opinions to shift before election day. That doesn’t make their insights any less worthwhile.

Bridge welcomes guest columns from a diverse range of people on issues relating to Michigan and its future. The views and assertions of these writers do not necessarily reflect those of Bridge or The Center for Michigan. Bridge does not endorse any individual guest commentary submission.
