
Behind BC's Pollster Fail

Were they foiled by the '10-second Socred'? A look at several possibilities.

Tom Barrett, 17 May 2013, TheTyee.ca


Tuesday's outcome was not great publicity for some pollsters. Photo: Shutterstock.

"Any election is like a horse race, in that you can tell more about it the next day." -- Sir John A. Macdonald

Greg Lyle has seen a lot of election campaigns -- and campaign polls -- as a pollster and a political organizer.

He says there's a key difference between parties' internal polls and the polls you read about in the media.

"Parties spend a lot of money on polling," Lyle said in an election night interview.

A major party will spend between $150,000 and $200,000 on polls during an election, he said. Public polls, the kind you read about in the media, are either sponsored by media outlets at a relatively low cost or given away free.

Pollsters tend to be political junkies. They like to be a part of the campaign drama and they want to know what's happening. That's part of the reason they give campaign polls away.

But political polls are also loss leaders for pollsters. They make their money testing public opinion for people who want to sell shampoo and potato chips; having the media talk about their election insights helps attract such clients.

So if you're a pollster, calling an election correctly is great publicity.

Having people say "How did the pollsters get it so completely, utterly, ridiculously, ludicrously wrong?" is not great publicity.

'10-second Socred' syndrome

Lyle, managing director of the Innovative Research Group, thinks the answer to that question lies partly in pollsters' methodology. Many of the big polls taken during the campaign were online polls: a pollster assembles a panel of tens or hundreds of thousands of people who are willing to answer questions, sometimes for a token fee. The pollster conducts a poll by drawing names from that panel and sending out emails with links to questionnaires.

Some experts argue that such polls pose problems. Online polling has generally been pretty successful, but critics like Lyle say these polls don't "respect the rules of polling, which is that everybody has a random chance, or an equal and known chance of being selected."

We don't know if the people who join online panels have the same characteristics as the general population. For one thing, one in five Canadians don't use the Internet. Those people aren't part of pollsters' online panels; does that mean the pollsters are missing something?
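To make the coverage worry concrete, here is a rough sketch -- the vote shares are invented, chosen only to illustrate the mechanism -- of how a panel that never reaches the offline fifth of the population can drift from the true number if offline voters lean differently:

```python
# Illustrative only: how a panel that excludes the offline population can
# drift from the true number. All vote shares here are invented.

online_share = 0.80              # roughly four in five Canadians are online
offline_share = 1 - online_share

liberal_support_online = 0.40    # hypothetical support among online voters
liberal_support_offline = 0.52   # hypothetical support among offline voters

# True support is a weighted average across both groups.
true_support = (online_share * liberal_support_online
                + offline_share * liberal_support_offline)

# An opt-in online panel only ever hears from the online group.
panel_estimate = liberal_support_online

print(f"True Liberal support:  {true_support:.1%}")    # 42.4%
print(f"Online panel estimate: {panel_estimate:.1%}")  # 40.0%
print(f"Coverage gap:          {panel_estimate - true_support:+.1%}")
```

The gap only exists if offline voters actually differ from online ones; weighting can correct for some of it, but only for differences the pollster knows to look for.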

Perhaps pollsters missed the opinions of people who aren't fluent in English, Lyle suggested. Or maybe the pollsters weren't asking the right questions. "If someone had asked the question, 'I don't really like what the Liberals have been up to, but I think they're the best to run the government,' I think we'd find that's the question that explains what was going on."

Lyle, who was Liberal Gordon Campbell's campaign director in the 1996 B.C. election, recalled the phenomenon of the "10-second Socred" -- voters who supported B.C.'s Social Credit party only during the time they were in the polling booth.

How many people who told pollsters they were going to vote for the New Democratic party stared at their ballot Tuesday, thought about Premier Christy Clark's warnings of looming economic chaos, then put an X next to the Liberal candidate instead?

The points Lyle raises all have major implications for pollsters. It's unclear how much polls influence voters, but still: how many people voted for the Green party instead of the NDP Tuesday because the polls suggested a comfortable New Democrat majority? How many NDP supporters stayed home for the same reason? How many BC Liberals were motivated to get out? How many BC Conservatives had second thoughts?

After spectacular failures in Alberta last year and B.C. Tuesday, the polling industry has some soul-searching ahead of it. As pollster Angus Reid wrote in The Globe and Mail: "Public polling has to change with the times and find the resources to dig deeper into the more diverse -- and fickle -- Canadian psyche."

Two main potential errors

There are two main ways the pollsters may have blown it. They may have misread what British Columbians were thinking because of flaws in their methodology.

Or, as Lyle's 10-second Socred comment suggests, maybe the polls were right at the time they were taken, but voters pulled off a massive, last-minute swing away from the NDP that none of the polls could catch.

Methodology first: all pollsters these days face a huge challenge reaching respondents. The traditional method, phoning people at random, worked pretty well for years. But now people screen their calls or hang up on pollsters, and cell phones complicate things further.

To get a sample of 800 prospective voters by phone, pollsters will likely make 8,000 phone calls or more. This is expensive and time-consuming. And it raises the question of whether the people who won't talk tend to vote differently from the people who do.
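The arithmetic behind that question is stark. Here's a back-of-the-envelope sketch -- the support figures are hypothetical, invented only to show the mechanism -- of how a 10 per cent response rate leaves room for the silent majority to differ from the people who picked up:

```python
# Back-of-the-envelope: the call volumes from the paragraph above, plus a
# hypothetical gap between responders and non-responders.

calls_made = 8000
completes = 800
response_rate = completes / calls_made
print(f"Response rate: {response_rate:.0%}")    # 10%

liberal_support_responders = 0.41      # invented for illustration
liberal_support_nonresponders = 0.46   # invented for illustration

# The poll can only report what the responders said...
poll_estimate = liberal_support_responders

# ...but if the full dialled sample mirrors the electorate, the true
# number blends both groups.
true_support = (response_rate * liberal_support_responders
                + (1 - response_rate) * liberal_support_nonresponders)

print(f"Poll estimate: {poll_estimate:.1%}")    # 41.0%
print(f"True support:  {true_support:.1%}")     # 45.5%
print(f"Non-response bias: {poll_estimate - true_support:+.1%}")
```

The bias only appears if the 90 per cent who didn't answer vote differently from the 10 per cent who did -- which is exactly the question raised above.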

So pollsters have tried other methods of reaching people. One is known as Interactive Voice Response, or "robocalling," where a computer dials numbers at random and a recorded voice asks questions. Respondents answer by pushing buttons on their phones.

Other pollsters use the online method described above. All these methods have advantages and drawbacks. IVR was the best of a bad lot in the last week, but the method has also missed the boat badly elsewhere.

Seeing as all the different methods suggested an NDP majority, something else may be at work. University of B.C. political scientist and polling expert Richard Johnston has spoken of polls' tendency to underestimate the support for incumbents in a number of recent Canadian elections. Is there a consistent bias here? Why? Certainly, it's a factor we should keep in mind in coming elections.

There may also be a problem with "completion windows" -- the time that pollsters spend in the field. In a bid to be current and avoid being tripped up by last-minute switchers, most pollsters were polling over one or two days and releasing the results immediately, often on the same day they were in the field.

Choosing a completion window is a fine balancing act. If you take too long in the field, a late shift in opinion will get buried in your earlier data. But not everyone responds to pollsters right away, be they on the phone or online. The people who respond right away may have different voting habits than the people you missed on the first round.

If you don't make an effort to get the opinions of those non-responders, you might miss something. Did the pollsters' desire to be current cause them to miss some Liberal voters?
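A toy example shows the trade-off. The daily numbers below are invented, with a late swing on the final day:

```python
# Invented daily NDP numbers; the late swing arrives only on day 4.
daily_ndp_support = [0.45, 0.45, 0.44, 0.39]

# A four-day field window averages across all the interviews,
# so the late shift is diluted by the earlier days...
long_window_estimate = sum(daily_ndp_support) / len(daily_ndp_support)

# ...while a one-day window catches the shift, but hears only from
# the people who happened to respond on that final day.
short_window_estimate = daily_ndp_support[-1]

print(f"Four-day window: {long_window_estimate:.1%}")   # roughly 43%
print(f"One-day window:  {short_window_estimate:.1%}")  # 39.0%
```

Neither estimate is wrong, exactly; they just answer different questions, and neither guarantees you've heard from the slower responders at all.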

The secret life of voters

Some in the business think there are problems with the questions pollsters ask. If you're asked, "Who would you vote for if an election were held today?" you might say, "the Green party," even though there's no Green candidate in your riding.

The pollster may give you a list of party names and ask which one you would vote for. You might reply NDP. Then, when you get into the little cardboard voting booth and look at your ballot, the name you recognize is that of the Liberal candidate, who's the incumbent. The Liberal incumbent gets your vote.

Harris Decima chairman Allan Gregg told the Canadian Press that this is a problem: "Sophisticated polls conducted for political parties plug in the names of local candidates when surveying voters, Gregg said, whereas polls conducted for the media -- for free -- generally can't afford to go to such lengths."

But Angus Reid used a facsimile ballot in its last two election polls, which found seven- and nine-point leads for the NDP.

A further difficulty for pollsters is predicting which of their respondents will actually vote. With turnout rates dropping to nearly 50 per cent, this becomes a vital question. If one party's supporters are significantly more likely to vote, a poll has to account for that somehow.

Party pollsters build sophisticated models to predict whether their respondents will turn out. The survey firms whose work gets published in the media try to do the same, but some party pollsters say those public models are comparatively crude.
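Here is a minimal sketch of what turnout weighting looks like in principle. The respondents and likelihood scores below are invented, and real models lean on demographics and past voting behaviour, but it shows how the same raw answers can produce very different horse-race numbers once likely voters count for more:

```python
# Minimal sketch of turnout weighting; all data here is invented.
respondents = [
    # (stated vote, estimated likelihood of actually voting, 0 to 1)
    ("NDP", 0.6), ("NDP", 0.5), ("NDP", 0.7),
    ("Liberal", 0.9), ("Liberal", 0.8), ("Liberal", 0.9),
]

# Unweighted horse race: every respondent counts equally.
raw_ndp = sum(1 for party, _ in respondents if party == "NDP") / len(respondents)

# Turnout-weighted: each respondent counts in proportion to how likely
# they are to actually cast a ballot.
total_weight = sum(score for _, score in respondents)
weighted_ndp = sum(score for party, score in respondents
                   if party == "NDP") / total_weight

print(f"Unweighted NDP share:       {raw_ndp:.1%}")       # 50.0%
print(f"Turnout-weighted NDP share: {weighted_ndp:.1%}")  # about 41%
```

If NDP supporters really were less likely to show up on Tuesday, a model like this would have pulled the published numbers closer to the actual result.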

All these criticisms assume that pollsters somehow missed what people were thinking in the last days of the campaign. But what if the electorate of British Columbia went to bed Monday night with a collective intention to vote in a comfortable NDP majority?

What if there really was a horde of 10-second Liberals who confounded the pollsters by switching their votes at the last minute?

Kyle Braid, vice-president of Ipsos Public Affairs, thinks a bit of that happened.

"The reality is that one in 10 (11 per cent) B.C. voters decided in the voting booth on election day to mark their ballot for their candidate..." he wrote in a press release.

How's he know that? Well, he took a poll...  [Tyee]
