DANIEL VAUGHAN: The polls are wrong on the midterms

The NASA DART mission successfully crashed into an asteroid for the express purpose of attempting to shift the path of that rock. If you watch the video, NASA engineers launched their “dart” ten months ago and hit a bullseye 6.8 million miles from Earth. It’s an incredible feat of precision, and a devotion to accuracy that I wish pollsters in the United States shared.

Who you poll matters.

We’re about to enter the month of October, with the midterm elections a little over a month away. Instead of providing an accurate picture of how Americans plan to vote, pollsters are throwing out junk polls. We know this purely by glancing at the methodology: pollsters are still running Registered Voter screens instead of Likely Voter ones.

What’s the difference, you ask? It’s one of accuracy. When polling Americans, you decide whom to include in your sample. You can poll everyone, poll only those registered to vote, or interview only people who say they’re likely to vote in the upcoming election.

As Gallup points out, polling everyone is a fool’s errand for elections. “It’s unrealistic to do so because we know that a percentage of these national adults not only won’t vote, but can’t vote — because they are not U.S. citizens or are not registered to vote in their local areas.”

If a person legally cannot vote, asking for their preference in the election tells you very little about the outcome. That leaves you with screening for registered voters or likely voters.

The problem with registered voters is that they may not vote. Gallup explains again, “Of course, we know that in the final analysis, not all of these registered voters will actually end up voting. So Gallup has over the years created systems to isolate likely voters, that group of individuals who we can estimate are most likely to actually turn out and vote.”

Likely voter polls are better.

The methodology for creating a pool of likely voters is up for debate. But it’s still better to ascertain what that group thinks than to poll anyone else. If a person plans on voting, we should be asking them the election questions to get an idea of what’s happening.

Gallup provides a helpful example. Its final poll before the 2004 election showed the following:

Registered voters: George W. Bush 46% / John Kerry 48%

Likely voters: George W. Bush 49% / John Kerry 47%

Kerry was ahead among registered voters by 2 points, while Bush was ahead among likely voters by 2 points.

If you’d relied upon registered voter-only polls, you’d have believed John Kerry held the lead. But if you took the extra step to determine who would vote, the data said Bush led.

Gallup adds this note on that example: “The final election result? Bush won the popular vote over Kerry by about 2.5 percentage points, almost exactly what our likely voter estimate predicted. Had we reported only registered voters, we would have estimated a Kerry victory. In other words, had all registered voters turned out in 2004, Kerry would have been elected president. But all registered voters didn’t turn out. There was a Republican advantage among those who did turn out. And Bush won.”

Registered voter polls are wrong.

I make all those points to return to the upcoming midterm elections. In the RealClearPolitics average of polls, Democrats currently hold a slim lead over Republicans. If you average all the polls together, Democrats have a 0.3-point lead on the generic ballot.

However, if you separate the registered voter polls from the likely voter polls, the story changes. As of September 25, 2022, averaging only the registered voter polls, which make up a majority of the surveys, shows Democrats leading by 2.8 points on the generic ballot.

If you take the likely voter polls and average them out, Republicans hold a 2.8-point lead on the generic ballot. That’s a 5.6-point gap between the two poll types, even though they’re asking the same question. It’s the same pattern as Gallup’s 2004 example of Bush and Kerry.

Media pushes narrative over accuracy.

Mainstream pollsters are continuing to push registered voter polls. In the RealClearPolitics average, the outfits still using registered voter screens this late in the process include The Economist, Politico, NBC News, The New York Times, and Harvard. Rasmussen, Trafalgar, CBS News, Emerson, and ABC News / Washington Post are the pollsters running likely voter polls, and we’ve only gotten likely voter polls from a few of them recently.

That tells us that the polls are wrong. Anyone running a registered voter poll right now is putting out bad information. Likely voter polls may have their own issues, but at least those pollsters are attempting to answer the question — who is going to vote in the November midterms?

Some pollsters are trying to hit the asteroid. Others aren’t even making it out of orbit. For a press allegedly obsessed with misinformation, it sure likes to put it out with polls. The next time you read a polling story, ask yourself: how did they run this one?