Ballot box at a polling station (photo: iStock)

Election polls.

The public seems to both love and hate them – often simultaneously.

Politicians, political scientists, and journalists obsess over them, along with a growing number of voters.

Parties seem to rise and fall as a result of recent polling, with voters fleeing parties which have fallen under the dreaded electoral threshold, or surging to reinforce larger parties, like the Likud, which are struggling to win a plurality.

A case in point is Yisrael Beytenu. Just weeks ago, the party was safely above the 3.25% minimum threshold in every poll, winning five to six seats in most surveys. Once it fell below the threshold in a poll, however, raising doubts about its ability to enter the next Knesset, voters fled in droves, leaving Yisrael Beytenu with as little as a single seat’s worth of votes.

The Zehut party, which is hovering right around the threshold in recent polls, launched an ad campaign against the fixation with poll numbers, condemning the phenomenon as “Hasima-phobia” or “Threshold-phobia”.

‘The polls were wrong about Trump and they’re wrong now’

Alongside the obsession with polling comes a certain resentment and skepticism among many voters – even those who can’t help but dive into every new survey, hoping their favored party or candidate has ticked upwards.

To some extent, the skepticism is natural. Professionally conducted polls, which we are constantly reminded are “scientific”, are often heralded as crystal balls, giving us a glimpse into the future. One of the more popular American election analysts, University of Virginia political scientist Larry Sabato, even named his online election newsletter “The Crystal Ball”.

Following President Trump’s 2016 upset victory, however, poll-bashing has become a regular pastime, particularly in some quarters of the Right.

The polls all showed Hillary Clinton winning by a landslide, the argument goes, and the election forecasters at every major media outlet – The Washington Post, The New York Times, The Huffington Post, and CNN among them – said Trump would lose.

In an interview published by Maariv last Friday, Yisrael Beytenu chief Avigdor Liberman tapped into this skepticism – bordering on sheer disbelief – in response to his own party’s collapse in recent surveys.

“I see the [polls] as a form of psychological warfare, manipulation. Look, on November 7th, 2016, all the media outlets – CNN, Politico, Washington Post, New York Times – all of them, without exception, predicted that Hillary Clinton would be the president of the US. And on November 8th they got Trump. In 2015, Bougie [Yitzhak] Herzog was predicted to become prime minister, and we all know how that turned out.”

‘Skewed Polls’

The insistence by some on the Right that election polls are unreliable – either because of pollster bias or some systematic flaw in the system – is nothing new.

In the 1980s and 1990s, some analysts and pundits argued that unexpected wins for the GOP in the US, and the Conservative Party in the UK, were the result of respondents hiding their true voting intentions during telephone surveys, saying they would vote for the choice they perceived was “politically correct”, despite having no intention of actually doing so.

The alleged phenomenon was dubbed the “Shy Tory” factor in the UK, and the “Bradley Effect” in the US, after Tom Bradley, the Los Angeles mayor who lost the 1982 California gubernatorial race despite leading in the polls.

Whatever merits the theory had in an age of live-interview telephone polls, the concept of the Shy Tory has had a powerful and enduring effect on the willingness of some to accept any poll, even as the shyness factor has diminished with pollsters’ transition to robo-calls and internet polls.

In an age of increasing polarization – and politicization – of news reporting, it’s hardly surprising that over-the-top predictions by some political analysts have left the public jaded about the sometimes fervent belief in poll results. After all, the Huffington Post claiming on election day that Clinton had a more than 99% chance of winning – only to have her lose – inevitably makes political expertise look unreliable.

That being said, the criticism of the polls themselves – not the analyses done of them by third parties – has been exaggerated, and in many instances, downright false. In addition, claims that polls systematically lean to the left ignore elections where the opposite is true.

Before the HuffPo’s “99%” debacle in 2016, there were the “skewed polls” of 2012.

A whole slew of pundits, election analysts, and even candidates themselves, bought into the theory that Republicans were being systematically undercounted in presidential polls.

Mitt Romney wasn’t losing to Barack Obama – no, he was actually winning, but the polls were undersampling Romney’s base. In other words, the polls were “skewed” towards the Democrats, showing a level of Democratic turnout many Republicans were skeptical could be real. Black turnout would have to exceed white turnout, they said, something which had never happened before.

One fervent believer in the “skewed polls” theory, Dean Chambers, put up a website which “unskewed” each poll, reweighting the results by altering the turnout models in the belief that the electorate would look more like that of 2004 than 2008 – since, the argument went, 2008 was a banner year for Democrats only because of the collapsing economy and George W. Bush’s record unpopularity.
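For illustration, here is a minimal sketch – in Python, with hypothetical numbers rather than Chambers’ actual figures – of what this kind of turnout-model reweighting amounts to: take a poll’s results broken down by party identification, then recompute the topline under a different assumed turnout mix.

```python
# Minimal sketch of turnout-model reweighting ("unskewing").
# All numbers below are hypothetical illustrations, not real poll data.

def reweight(candidate_share_by_group, assumed_turnout):
    """Recompute a candidate's topline share under a different turnout mix.

    candidate_share_by_group: share of each group backing the candidate
    assumed_turnout: assumed share of the electorate each group makes up
                     (should sum to 1.0)
    """
    return sum(candidate_share_by_group[g] * assumed_turnout[g]
               for g in assumed_turnout)

# Hypothetical support for the incumbent within each party-ID group.
obama_by_group = {"D": 0.92, "R": 0.06, "I": 0.45}

# A 2008-like electorate (more Democrats) vs. a 2004-like electorate.
turnout_2008_like = {"D": 0.39, "R": 0.32, "I": 0.29}
turnout_2004_like = {"D": 0.37, "R": 0.37, "I": 0.26}

print(reweight(obama_by_group, turnout_2008_like))  # ~0.51
print(reweight(obama_by_group, turnout_2004_like))  # ~0.48
```

As the toy numbers show, swapping in a more Republican-friendly turnout assumption can flip a modest lead into a deficit without touching the underlying survey responses – which is why the whole exercise stands or falls on whether the assumed electorate actually shows up.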

Former Clinton aide-turned-conservative pundit Dick Morris famously predicted Romney would beat Obama with 325 electoral votes to 213. When the election actually went in Obama’s favor, 332 to 206, Morris’ blunder cost him his position as a Fox News contributor.

But it wasn’t only Morris who bought into the “skewed polls” theory. George Will predicted a Romney landslide, as did the much-respected political historian Michael Barone.

Veteran presidential campaign advisor Karl Rove, who helped engineer Bush’s two White House wins, refused to believe that his own prediction of a Romney win had been based on false assumptions, arguing on air on election night – when Fox News called Ohio for Obama – that the call was wrong.

Even Romney himself, and much of his inner circle, were shocked by the defeat, believing that the polls really were off.

And in reality, the polls were off – to an extent.

Obama ended up defeating Romney by almost four points, 51.1% to 47.2%, with 332 electors to 206. But the polling averages going into election day painted a different picture, showing a veritable dead heat in the popular vote and the Electoral College. On election day, the final RealClearPolitics average of polls was well within the margin of error, with Obama leading by just seven-tenths of a percent – 48.8% to 48.1%.
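To see why a lead of less than a point reads as a dead heat, here is a brief illustrative sketch in Python – using hypothetical poll toplines and sample sizes, not RealClearPolitics’ actual inputs – that averages a few polls and computes the standard 95% margin of error for a single survey.

```python
import math

# Hypothetical poll toplines (percent for each candidate) and sample sizes.
polls = [
    {"obama": 49, "romney": 48, "n": 1000},
    {"obama": 48, "romney": 49, "n": 800},
    {"obama": 50, "romney": 47, "n": 1200},
]

def simple_average(polls, key):
    """Unweighted average of a candidate's share across the polls."""
    return sum(p[key] for p in polls) / len(polls)

def margin_of_error(p_share, n, z=1.96):
    """95% margin of error (in points) for one poll of size n."""
    p = p_share / 100.0
    return 100.0 * z * math.sqrt(p * (1 - p) / n)

lead = simple_average(polls, "obama") - simple_average(polls, "romney")
moe = margin_of_error(50, 1000)  # ~3.1 points for a 1,000-person poll

print(f"lead: {lead:.1f} pts, single-poll MOE: +/-{moe:.1f} pts")
# A roughly 1-point lead with a ~3-point margin of error is a statistical dead heat.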

Of the 538 electoral votes, Obama had 201 leaning in his direction, compared to 191 leaning towards Romney, with 146 tossups. In other words, the election was as close as it gets. Except that it wasn’t.

Poll Myths: Brexit, Marine Le Pen, and the Trump victory

Obama not only won the popular vote by a significantly wider margin than predicted, he won every single battleground state besides North Carolina. The polls, it was later found, had in fact been skewed – in Romney’s favor.

Black voters did turn out at a higher rate than whites for the first time ever, and pollsters undersampled cellphone users, missing younger voters who didn’t have landlines.

But what about the Brexit referendum vote in the UK in June 2016, and President Trump’s presidential win that November – both of which were missed by the polls?

These two votes, often cited by poll skeptics, were in fact missed by most of the polls – kind of.

From the outset of the campaign in mid-2015, Brexit polls showed support for “Leave” trailing heavily behind “Remain”. But the polls also showed that a large portion of the electorate – close to a fifth of voters – were undecided for most of the campaign.

As the number of undecideds fell, Leave surged, actually surpassing Remain in mid-June 2016, as the referendum neared.

After the fatal shooting of a pro-Remain Labour MP by a white supremacist just days before the vote, however, the polls shifted, giving momentum to the Remain campaign. But Remain’s lead was narrow and ephemeral – and polling aggregates took this into account, suggesting the race would be a nail-biter. Unlike with its later presidential prediction, the Huffington Post said the Brexit vote would be close, giving Remain just a half-point lead, 45.8% to 45.3%, with nearly 9% undecided. In the end, Leave won by 3.78 points.

Five months later, Trump’s upset victory seemed to furnish further proof that polls are simply wrong.

But as with Brexit, it is a bit more complicated than that.

Actually, the national polling in the 2016 race was very accurate – especially when you look at the average of polls.

The final RealClearPolitics average of polls gave Clinton a 3.2-point lead in the popular vote, just slightly more than the 2.1-point margin by which she actually won it.

Even the state polling averages weren’t bad – by and large. With the exception of Wisconsin, the states that were truly competitive were identified as such by the polls. And even in the states where the averages were off, the final results, in most cases, fell well within the margin of error.

In essence, Trump pulled off a very unusual win, narrowly carrying three states which polls suggested leaned Democrat, while losing the national vote. That doesn’t mean there weren’t bad polls – the polling in Wisconsin simply failed to pick up on Trump’s support there. But it’s also not surprising that where more polls were conducted – say, in the national vote – the results were accurate, and where few polls were conducted – Wisconsin – the results were less accurate.

Rather than taking Brexit and the 2016 US presidential election as cautionary tales against ignoring the margin of error, some have used the two cases as proof that polls at large can – and should – simply be ignored entirely.

Thus, as with Mitt Romney in 2012, there was a certain refusal during France’s 2017 presidential election to accept the polls, with claims of “shy Front National” supporters who were not being counted in surveys but would turn out en masse on election day.

In reality, not only did the liberal Emmanuel Macron win by a massive two-to-one landslide over the right-wing populist Marine Le Pen – the polls actually overestimated Le Pen’s support. Le Pen ended up winning just 33.9% of the vote to Macron’s 66.1%, but had been polling at an average of 38.2% to Macron’s 61.8%.

As in the US election in 2012, the polls had been off – but in the opposite direction from the one critics had suggested.

But what about Israeli election polling? What is its track record for predicting winners and losers?

Part two of this analysis, focusing on the history of Israeli election polling, can be found here.