What will the polls do?

Now Bill English has taken over from John Key as Prime Minister there will be a lot of interest in what the polls do.

Internal party pollsters may be busy but the public are unlikely to get much of an indication this year.

Roy Morgan may do a December poll but their polling is usually over two weeks so would span the transition from Key resigning to English taking over. At best it may give us a rough indication.

I think it will be the end of March before we get a good idea of whether the polls show any support shifts or not, by then we will have several Roy Morgan polls plus possibly Colmar Brunton and Reid Research polls for comparison.

Poll possibilities:

  • National hold or rise – I wouldn’t rule this out; it’s feasible, since Labour, the Greens and NZ First haven’t changed apart from their hope levels.
  • National drop a bit, but from their current high forties to the mid forties, which is still within recent fluctuations – I think this is most likely in the short term while the public ponders.
  • National drop to the low end of their recent range in the low forties – quite possible, but unless English stuffs something up badly I think it’s less likely.
  • National support crashes and Labour leaps – a lefty dream rather than likely reality.

Andrew at Grumpollie has looked at what might happen with ‘preferred Prime Minister’ based on past Colmar Brunton trends in ‘Why we shouldn’t get excited about the next preferred PM result’:

The next Colmar Brunton preferred Prime Minister result will generate a lot of interest – probably a few cheers from the opposition and kind folks at The Standard, and fairly cautious or defensive responses from some National Party supporters.

Here are my predictions:

  1. Bill English’s result will be lower than John Key’s current result of 36%.
  2. Andrew Little and/or Winston Peters’ results will increase, but probably not substantially (which one does or doesn’t increase could be interesting!)
  3. John Key will still feature strongly in the results for this question.
  4. Support will increase for Paula Bennett and Simon Bridges.

He also suggests what the reactions will be and why people need to ‘calm the farm’ – I hope the media will remain calm but doubt they will resist trying to make a sensation out of rough indicators.

There are also some interesting trend charts from Colmar Brunton.

The only certainty is that polls will continue to fluctuate – plus too much will be read into them by media and pundits with vested interests.

Why polls are wrong

Polls will always be wrong or ‘inaccurate’ because of course they aren’t elections where people vote. They try to measure the approximate opinions of far smaller numbers of people prior to elections.
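Even a perfectly run poll carries sampling error simply because it asks so few people. As a rough back-of-envelope illustration – assuming a sample of around 1,000 respondents, which is typical for the public polls discussed here – a small Python sketch of the 95% margin of error:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p from a simple random sample of n people."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical public poll samples around 1,000 people.
print(round(margin_of_error(0.50, 1000) * 100, 1))  # ~3.1 points for a party near 50%
print(round(margin_of_error(0.28, 1000) * 100, 1))  # ~2.8 points for a party near 28%
```

And that roughly ±3 points is only the sampling component – the other sources of error discussed below come on top of it.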

Ex-pollster Andrew at Grumpollie posts Why the polls get things wrong:

Folks trying to explain polls often seem to look for one or two sources of error. Even the UK polling enquiry seemed to be looking for that ‘one big thing’ and appeared to discount small issues that could have made a small difference for only some polls.

Having tinkered with polls and surveys over the last decade or so, I can confidently state that there is never one source of error. They are infinite, they often interact and amplify each other, and they can also cancel each other out.

Media likes to create simple stories out of polls – a major reason they pay for polls to be done is to create exclusive stories for themselves – so it is common for them to oversimplify problems when polls are ‘wrong’.

It’s not the job of a pollster to carry out the perfect poll. It’s their job to understand why they can’t, and to minimise sources of error as much as possible within time and budget constraints.

Companies can make a lot of money from polling and surveys under the illusion their polls are ‘professionally conducted’, when they may not be doing anything at all to systematically understand and minimise error.

The real problem is that polling is easy to do, but it’s really hard to do well. 

And impossible to do perfectly.

So media companies pay as little as possible to have imperfect polling done to generate news stories that aren’t actually news, and they try to portray their polls not only as gospel but as accurate predictors of future elections – something polls can never deliver, except occasionally by random chance.

Some pollsters have become a part of the problem (I suspect in the US at least) but the bigger culprits are the media that misuse polls.

Polls are wrong because they can never be ‘right’, no matter how media try to fool the voters. But the voters have the last laugh each election.

Poll reaction ‘worse than usual’

Andrew from Colmar Brunton just tweeted:

Wow this week’s poll! The criticism has been much worse than usual. NEVER happens when Labour support increases.

Labour is down 4 to 28%, the first time they have dipped below 30 since the 2014 election, and National is up 3 to 50% – see One News/Colmar Brunton April 2016.

The reaction from the left, apparent on Twitter and at The Standard, ranged from disbelief to blame of everything from bad or corrupt polling methods to misleading or corrupt media and John Key.

Hard core Labour supporters have now had nearly eight years of post-Clark frustration, disappointment and dashed hopes.

On current performances (of the party and of leader Andrew Little) this looks unlikely to change any time soon.

Labour has faded from a major party with a widely respected leader to a struggling party with diminishing status.

They are on to their fourth leader and their latest one seems to be heading towards failure, probably hastened by this week’s lurch into dirty politics.

Except that the party seems averse to swapping leaders yet again and going through yet another upheaval, and no one appears keen to step up and take over what now looks like a poisoned chalice.

The remaining Labour supporters (and leadership) seem blind to their own fairly major shortcomings, so they blame their failures on everyone and everything else.

I’ve experienced this myself over the years, especially at The Standard, where attack seems to be their only way of dealing with continued failure to gain any political traction or to score significant hits on opponents.

You can get banned from The Standard for telling them they are doing their cause a disservice with their attacks on anyone deemed disloyal or in disagreement with their behaviour or their ideals.

Things are probably looking grimmer than ever for Labour. So it’s not surprising to hear that Andrew and Colmar Brunton are bearing the brunt of their anger.

The 5 stages of loss and grief:

  1. Denial
  2. Anger
  3. Bargaining
  4. Depression
  5. Acceptance

Labour supporters seem to have spent most of the last seven and a half years bouncing between 1 and 2, with a few bouts of 4.

This term they have begun to talk about some 3 with Greens and NZ First but keep falling back to 1 and 2, which is what Andrew (Grumpollie) has experienced since the poll was published.

And another Andrew, the Little one, seems to have taken a major dose of 2 with his attacks in Parliament this week, so Colmar Brunton may need to prepare themselves for the next poll. Neither the poll nor the reaction may be pretty.

Political polling in New Zealand

Last week Andrew at Grumpollie posted his thoughts on The future of polling in New Zealand.

His latest post suggests that the future is not looking bright: Are we down to three polls in NZ?

So, DigiPoll has shut up shop, and I haven’t seen a poll out of Fairfax in a long time.

Digipoll’s website is still up but I can’t find them in the news since early January. The last Herald-Digipoll was 4-14 December 2015.

The last Fairfax-IPSOS poll that I can find is from just prior to the last election, 13-17 September 2014. IPSOS is still operating in Australia but seems to have given up on New Zealand polling.

Are we down to just three polls now? (Newshub, ONE News, and Roy Morgan.)

That’s how it looks – see Opinion polling for the next New Zealand general election.

This is not good at all, if true. With less data, it’s harder to develop new methodological and analytical approaches to polling.

It’s not good for pollsters or for political junkies, but I’m not sure most people would care.

There are two other polling companies I’m aware of, Curia and UMR. The problem with them is they do ‘internal polling’ for National and Labour respectively so their polls aren’t made public.

That leads to an issue that is worth a separate post – see Polling and better democracy.

On polls and cellphones

When polls are favourable people applaud. When they are not favourable they tend to look for reasons – often reasons other than a poorly performing party, like ‘unfair’ media coverage. And the lack of polling of cellphones often comes into the discussions.

Pollster ‘Andrew’ blogs:

Calling cells is not, and will never be, the magic bullet for opinion polling.

There are many aspects to getting accuracy in polling.

Rob Salmond (Labour adviser) at Polity has posted Endangered: Polls without cell phones, looking at the trend away from landline use and towards cellphone use in the US.

The US is a few years ahead of New Zealand on mobile adoption and decoupling from landlines, but I think within 5 years we will see these kinds of proportions in New Zealand. This will make current pollsters’ policies of refusing to call cell phones hugely problematic – they will cut out almost half the population. No amount of weighting can reliably undo a sampling frame that unbalanced.

This has been reposted at The Standard and has more comments there.
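As a rough sketch of Salmond’s weighting point – with entirely hypothetical proportions, none of them from his post – here is a small Python simulation of why demographic weighting alone struggles to correct a landline-only sampling frame when cell-only voters lean differently even within the same demographic group:

```python
import random

random.seed(1)

# Entirely hypothetical population: roughly 40% of households end up cell-only,
# and cell-only voters lean a bit differently even within the same age group.
N = 100_000
population = []
for _ in range(N):
    young = random.random() < 0.35                 # the variable the pollster weights on
    cell_only = random.random() < (0.65 if young else 0.30)
    p_support = 0.40 if young else 0.55            # support for 'Party A' by age
    if cell_only:
        p_support -= 0.08                          # a lean the landline frame never sees
    population.append((young, cell_only, random.random() < p_support))

true_support = sum(vote for _, _, vote in population) / N

# A landline-only frame simply cannot reach cell-only households.
frame = [person for person in population if not person[1]]
sample = random.sample(frame, 1000)

# Weight the sample back to the true age distribution (the standard correction).
young_pop = sum(young for young, _, _ in population) / N
young_samp = sum(young for young, _, _ in sample) / len(sample)

def weight(young):
    return young_pop / young_samp if young else (1 - young_pop) / (1 - young_samp)

weighted = sum(weight(young) * vote for young, _, vote in sample) / sum(weight(young) for young, _, _ in sample)

print(f"true support:      {true_support:.3f}")    # roughly 0.46 with these numbers
print(f"weighted estimate: {weighted:.3f}")        # roughly 0.50 – the bias survives the weighting
```

The gap between the true figure and the weighted estimate comes entirely from the people the frame can never reach, so more interviews or heavier weighting on age won’t close it.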

Andrew also comments on this at Grumpollie: Rob Salmond’s post on cell phone polling and says “I agree with Rob Salmond that within five years polling methodologies will likely change” but goes on to make some important points, including correcting a common misconception.

The company that I work for has no policy on “…refusing to call cell phones.” In fact, they do randomly dial cell phones for telephone surveys. They will also call them for the poll if a non-qualifying person in the household gives them a cell number to call.

Roy Morgan also states that it polls cellphones.

This latest New Zealand Roy Morgan Poll on voting intention was conducted by telephone – both landline and mobile telephone…

Colmar Brunton only polls landlines:

Nationwide random digit dialling of landline telephones using stratified random probability sampling to ensure the sample includes the correct proportion of people in urban and rural areas.

Reid Research, Digipoll and IPSOS don’t state (as far as I could see) whether they poll cellphones.

Andrew points to what he thinks is a far bigger issue than whether cellphones are polled or not:

At present my view is that, in New Zealand, non-response is a far far bigger source of error than non-coverage. If non-coverage of cell only households is such a big issue, how come most polls seem to over-state support for the Green party? And why don’t they under-state support for the Labour Party?

In New Zealand, does calling cell phones decrease non-response or increase it? Don’t underestimate the importance of this.

And he updates his post:

UPDATE: I’ve read, here and there, some comments that polls use a) published landline listings, or b) an outdated list of number banks for RDD sampling. I can categorically state that ‘a’ is absolute rubbish. None of the main media-client public polls use published listings. At the company I work for ‘b’ is also rubbish. It’s quite possible to uncover new number ranges.

For those interested, RDD works by randomly generating numbers within number banks, then connection testing them, and then re-sampling the connected numbers.
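Purely as an illustration of that shape – generate within banks, test for connection, then re-sample the connected numbers – here is a toy sketch in Python. The number banks and the connection test are made up; this is not any company’s actual procedure:

```python
import random

def rdd_sample(number_banks, n_required, is_connected):
    """Toy random digit dialling: generate numbers within known banks,
    test which ones connect, then sample from the connected numbers."""
    candidates = set()
    while len(candidates) < n_required * 5:              # over-generate; many won't connect
        bank = random.choice(number_banks)               # a known prefix range
        candidates.add(bank + f"{random.randint(0, 9999):04d}")
    connected = [num for num in candidates if is_connected(num)]
    return random.sample(connected, min(n_required, len(connected)))

# Hypothetical usage – real banks and live connection testing would replace these placeholders.
banks = ["04555", "03555", "09555"]
print(rdd_sample(banks, n_required=10, is_connected=lambda num: hash(num) % 3 != 0))
```

The point is only the shape of the process; how numbers are generated and tested in practice varies by company.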

I’m sure all polling companies do what they can to be as accurate as possible. And people look for reasons other than the failings of their own parties for unfavourable poll results.

Campbell Live poll and deceitful democracy

Campbell Live is running a poll on the GCSB bill until the results are announced on Monday 19 August. This coincides with a “Fill the Town Hall and Stop the GCSB Bill” campaign meeting in Auckland.

Whale Oil has had a dig at the poll – Is the Campbell Live GCSB Poll a piss-take or what? Whale points out the irony of capturing private data in a poll about the capturing of private data.

More significantly, a pollster points out obvious deficiencies in the poll – Campbell Live GCSB poll commits the ultimate sin of survey research – which points to a @CampbellLiveNZ tweet:

In the next week we’re aiming for the biggest opinion poll in NZ history. Do you support the GCSB Bill? Vote here.

Andrew explains what anyone familiar with polling will know.

Here’s a really important point for anyone wanting to judge the accuracy of this poll – big numbers do not provide representative data!

Here’s why…

This poll commits the ultimate sin of survey research – it uses a self-selecting sample.

People choose to take part based on the topic. This means that the poll only represents the New Zealanders who feel strongly enough about the GCSB bill to take part in the poll.

The only way to attempt to gain a representative result is to take a random sample of New Zealanders, and to ask them the question. Preferably, the topic of the poll shouldn’t be given to people in advance, so their decision about whether to take part will not be based on the topic in question.

He then shows how self-selection with polls can “make an enormous difference”.
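To put numbers on ‘an enormous difference’, here is a minimal simulation with made-up figures: a population where 40% genuinely support the bill, a random sample of 1,000, and a self-selecting poll where opponents are far more motivated to take part:

```python
import random

random.seed(42)

# Hypothetical population: 40% actually support the bill.
N = 1_000_000
population = [random.random() < 0.40 for _ in range(N)]

# A random sample of 1,000 lands close to the true 40%, give or take ~3 points.
random_sample = random.sample(population, 1000)
print(f"random sample of 1,000:  {sum(random_sample) / 1000:.1%}")

# Self-selecting poll: opponents are far more motivated to click than supporters.
def takes_part(supports_bill):
    return random.random() < (0.02 if supports_bill else 0.15)

self_selected = [vote for vote in population if takes_part(vote)]
print(f"self-selected, n={len(self_selected):,}: {sum(self_selected) / len(self_selected):.1%}")
```

With these figures the self-selecting ‘poll’ collects close to a hundred thousand responses and still reports support of under 10%, while the random sample of 1,000 sits within a few points of the true 40% – big numbers do not fix a self-selecting sample.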

But it’s worse than simply self-selection.

It takes a bit more effort but it’s still possible to vote multiple times in the poll.

And there are active campaigns to distort the poll to try and achieve a particular outcome. There is a major effort to make the poll a PR device, which is a distortion of democracy.

And it is likely to be promoted elsewhere.

Ironically a commenter ‘geoff’ at The Standard complains about the results being skewed by National.

Why put any merit in an online poll? Sure the GCSB bill will attract a lot of people who are legitimately against it but it is also going to attract the attention of National’s ‘Research Unit’ who, no doubt, will have a room full of people working day and night filling out the form and skewing the results.

The results will be skewed for sure, but it will be impossible to tell how much and in which direction the most skewing has occurred.

And this is becoming common. And skewed results are being used in PR campaigns to try and create deceitful perceptions.

It’s a common practice now to use a protest, a poll, a petition or stacked submissions to create a perception of democratic opinion. This is deceitful democracy.

Campbell Live should know better. Unless they do know what they are doing.

Andrew Little’s “nasty” accusation

Andrew Little accused David Farrar of shady practices on Twitter:

Get Farrar and Kiwiblog to promote it. Might be able to fool some of the people.

I just don’t trust your dodgy National Party focus groups and other shady practices.

Push polling the most common…

Another pollster commented (in his blog post):

That’s a nasty accusation, and no evidence was given.

Yes, it’s nasty. About as nasty as accusing a politician of destabilising his leader without providing evidence.

All the tweets are at Grumpollie’s Leading poll questions.

Update: a response from DPF:
