Opinion | Polls are sometimes wrong. We know. We study polls

Simon Schuster is a policy fellow at the Institute for Public Policy and Social Research at Michigan State University.

The day after the 2016 election, after polls had widely predicted that Hillary Clinton would win the presidency, many pundits questioned how so many polls had gotten the result so wrong.

Paul J. Lavrakas, the senior research fellow at Michigan State University’s Office for Survey Research, has decades of experience in polling methodology and co-authors the regularly updated book The Voter’s Guide to Election Polls. As the American political landscape becomes increasingly complex, he said, pollsters face more challenges.

“You’d think over time that polls are getting more accurate, but unfortunately due to the complexities of our elections in 2016, for a variety of reasons, the opposite is happening,” Lavrakas said. “They’re getting less accurate, we have less confidence, even among the scientific polls.”

Still, that doesn’t mean polls are no longer trustworthy, as Dan Thaler, OSR’s research analyst, explained.

“Even with all the complexities and pitfalls, reputable scientific polls are still able to project the winner of an election — or at least the popular vote winner — accurately the vast majority of the time,” Thaler said. “The key is knowing how to differentiate high quality polls from low quality polls.”

It’s important to realize that polls themselves can be tools of persuasion, especially in close races. Research indicates polls can influence the amount of donations a candidate receives and even whether voters show up on election day.

The Office for Survey Research both conducts and collects comprehensive survey data, along with researching the complex methods behind surveys and polling, in order to better understand how different approaches affect the accuracy of results. Its own surveys are usually carried out over several weeks, unlike most quick-turnaround election polls.

With the midterm election less than a week away, following a few guidelines can make it much easier to know which polls matter.

The source is crucial

A reputable source is key in assessing a poll’s trustworthiness. Make sure the poll was conducted by a legitimate news outlet or a nonpartisan polling firm, rather than by a political campaign or a partisan group.

Some groups conduct “push polls,” in which voters are contacted under the guise of a poll but are instead fed biased information intended to influence their vote.

Look to the details

The number of voters sampled, whether the poll surveys likely voters or merely registered voters, the size of the margin of error and the options provided to respondents all play a role in a poll’s accuracy. Whether the poll was conducted by phone, mail or online may also matter.

Though a poll’s full methodology is rarely provided, the margin of error, which in actuality is the margin of sampling error, is regularly available. For instance, say Candidate A is polled at 54 percent while Candidate B is polled at 46 percent, and the margin of sampling error is ±5 percentage points. Although Candidate A appears to be ahead, the race is statistically a tie, something that media outlets don’t consistently report.

That means Candidate A’s real level of support is likely between 49 and 59 percent, while Candidate B’s is likely between 41 and 51 percent. But this figure only gauges the error that comes from sampling voters; other kinds of errors can also skew results.
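For readers who like to check the arithmetic, here is a minimal sketch in Python of how those ranges arise. The 400-respondent sample size, the margin_of_error and support_interval helpers and the candidate numbers are illustrative assumptions, not figures from any actual poll; the formula is the textbook approximation for a simple random sample, which real pollsters adjust with weighting and design effects.

```python
import math

# Standard approximation (assuming a simple random sample) for the
# 95 percent margin of sampling error of a polled percentage.
def margin_of_error(share, sample_size, z=1.96):
    return z * math.sqrt(share * (1 - share) / sample_size)

# Range in which a candidate's true support likely falls.
def support_interval(share, moe):
    return (share - moe, share + moe)

# Hypothetical sample size: roughly 400 respondents yields the
# +/- 5 point margin used in the example above.
moe = margin_of_error(0.5, 400)               # ~0.049, about 5 points
a_low, a_high = support_interval(0.54, 0.05)  # 49 to 59 percent
b_low, b_high = support_interval(0.46, 0.05)  # 41 to 51 percent

print(f"Candidate A: {a_low:.0%} to {a_high:.0%}")
print(f"Candidate B: {b_low:.0%} to {b_high:.0%}")

# If the two intervals overlap, the apparent lead is within the margin
# of sampling error -- a statistical tie.
print("Statistical tie" if a_low <= b_high else "Lead exceeds the margin of error")
```

The point of the sketch is simply that an 8-point gap can disappear once both candidates’ ranges are drawn, which is exactly the situation described above.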

Many more complex factors can influence poll results

The wording of questions, the order in which they’re asked and even the order in which candidates’ names are read all have an impact on results, though their effects are more difficult to gauge. Even the number of times a phone number is called back when no one picks up can change the demographics of a sample.

“We find we reach different kinds of people depending upon the number of calls we make,” Lin Stork, OSR’s director, said. “Minority households don’t usually respond on the first call.”

Whether “undecided” is reported as a response also changes the result. Some pollsters don’t report undecided voters at all, instead allocating those responses to a candidate based on the respondent’s party identification. Others shift undecided responses to the candidate the voter says they are leaning toward.
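As a rough illustration, here is a minimal sketch of those two approaches in Python. The four respondents, the allocate helper and the party and “lean” fields are all hypothetical; actual pollsters’ allocation models are more involved.

```python
# Hypothetical respondents; "party_id" and "leans" stand in for the
# follow-up questions a pollster would use to allocate undecided voters.
responses = [
    {"choice": "A"},
    {"choice": "B"},
    {"choice": "undecided", "party_id": "A", "leans": "B"},
    {"choice": "undecided", "party_id": "B", "leans": "B"},
]

def allocate(respondent, method):
    """Assign a respondent to a candidate under a given allocation rule."""
    if respondent["choice"] != "undecided":
        return respondent["choice"]
    if method == "party_id":
        # Allocate the undecided voter to their party's candidate.
        return respondent["party_id"]
    # Otherwise, allocate to the candidate the voter leans toward.
    return respondent["leans"]

for method in ("party_id", "leans"):
    tally = {}
    for r in responses:
        candidate = allocate(r, method)
        tally[candidate] = tally.get(candidate, 0) + 1
    print(method, tally)  # party_id -> A: 2, B: 2; leans -> A: 1, B: 3
```

The same four responses produce a tied race under one rule and a clear lead under the other, which is why how a poll handles undecided voters can change the headline number.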

Multiple polls provide a better picture

“One thing that I see too often is cherry-picking the polls that are most favorable,” Thaler said. “A poll comes out that you like and even subconsciously that’s the one that your brain’s going to latch on to.”

To help readers avoid confirmation bias, polling aggregators such as RealClearPolitics and FiveThirtyEight list most of the available polls for major races.

“That’s one thing that these polling aggregators are good for, not overemphasizing one poll that stands out because it looks good to me,” Thaler said.
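Here is a minimal sketch of the idea behind looking at many polls rather than fixating on one. The three polls and their sample sizes are made up, and the plain sample-size-weighted average is far simpler than the models aggregators like FiveThirtyEight actually use.

```python
# Hypothetical polls of the same race; "share_a" is Candidate A's support.
polls = [
    {"pollster": "Poll 1", "share_a": 0.54, "sample_size": 400},
    {"pollster": "Poll 2", "share_a": 0.49, "sample_size": 800},
    {"pollster": "Poll 3", "share_a": 0.51, "sample_size": 600},
]

# Weight each poll by its sample size so larger surveys count for more.
total_n = sum(p["sample_size"] for p in polls)
weighted_avg = sum(p["share_a"] * p["sample_size"] for p in polls) / total_n

best_single = max(p["share_a"] for p in polls)
print(f"Most favorable single poll for A: {best_single:.0%}")   # 54%
print(f"Weighted average across polls:    {weighted_avg:.1%}")  # ~50.8%
```

The cherry-picked poll shows a comfortable lead; the average of all three shows a race that is essentially even.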

Understand polls are ultimately an estimate

Polls question a sample of voters in an effort to understand how the whole population will vote. There is always uncertainty in those results, along with room for opinions to shift before Election Day. That doesn’t make their insights any less worthwhile.

Bridge welcomes guest columns from a diverse range of people on issues relating to Michigan and its future. The views and assertions of these writers do not necessarily reflect those of Bridge or The Center for Michigan.

