Polls and Prediction Markets

I attended the launch last night of "Key to Victory", the regular post-election campaign review book edited by Stephen Levine and his co-editor.

I find these books so fascinating that I was even reading this one during the speeches!

Bryce Edwards has reviewed (h/t iPredict Blog) a chapter by Shaun McGirr and Rob Salmond on which sources of information best predicted the election outcome. Was it an individual poll, the iPredict markets, or the polls of polls?

The amount invested in iPredict was considerable:

  • $64,500 was traded over the likely nature of ‘the Maori Party's post-election relationship with National'
  • $25,800 was traded over the Wellington Central battle between Grant Robertson and Stephen Franks
  • $132,100 was traded over whether ‘there will be a National prime minister after the 2008 election'
  • $413,000 in total was invested in election-related predictions

And how did iPredict do?

So, how accurate was iPredict in 2008? McGirr and Salmond conclude that although iPredict overestimated the eventual support for both Labour and National, it was more accurate than any individual polling company.

And the individual polls:

In reality in 2008, McGirr and Salmond found this to be the case – with Colmar Brunton and DigiPoll exaggerating public support for National, and other polls exaggerating support for Labour (p.264).

So which polling companies were most accurate and inaccurate? McGirr and Salmond say that TV3's TNS poll was the best (as it was in 2005 as well), and Fairfax's Nielsen poll was the poorest.

The TV3 poll is the one that currently shows a 27% gap! Mind you, they are now with Reid Research, so a different methodology may be in use now.

Then they look at the polls of polls published by three outlets – NZPA, Rob (at 08 Wire) and myself (at curiablog).

In addition to the five opinion polls, some observers attempted to average out the idiosyncratic errors of the individual polls by aggregating them into a “poll-of-polls” using different methods. The New Zealand Press Association simply took the average of the estimates of the six most recent polls, while The New Zealand Herald took the average of the last four polls. Two blog-based polls-of-polls – one run by David Farrar of New Zealand's premier political blog Kiwiblog, and one hosted at a smaller blog [run by author Rob Salmond] called 08wire – weighted more recent polls with larger sample sizes more heavily (p.257).
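The two aggregation approaches quoted above can be sketched in code. The book does not spell out exact formulas, so this is an illustrative sketch: the poll data, the choice of exponential recency decay, and the 14-day half-life are all my assumptions, not anything from the chapter.

```python
from datetime import date

# Hypothetical poll records for illustration only:
# (fieldwork end date, sample size, National %, Labour %)
polls = [
    (date(2008, 10, 20), 750, 47.0, 35.0),
    (date(2008, 10, 25), 1000, 46.0, 36.5),
    (date(2008, 11, 1), 900, 48.5, 34.0),
    (date(2008, 11, 4), 1100, 47.5, 35.5),
]

def simple_average(polls, n=6):
    """NZPA-style aggregate: a plain mean of the n most recent polls."""
    recent = sorted(polls, key=lambda p: p[0])[-n:]
    nat = sum(p[2] for p in recent) / len(recent)
    lab = sum(p[3] for p in recent) / len(recent)
    return nat, lab

def weighted_average(polls, as_of, half_life_days=14):
    """Blog-style aggregate (assumed form): weight each poll by its
    sample size and by how recently it was taken."""
    weights = nat_sum = lab_sum = 0.0
    for d, n, nat, lab in polls:
        age = (as_of - d).days
        w = n * 0.5 ** (age / half_life_days)  # exponential recency decay
        weights += w
        nat_sum += w * nat
        lab_sum += w * lab
    return nat_sum / weights, lab_sum / weights
```

The weighted version lets a large, fresh poll dominate a small, stale one, which is the intuition behind weighting by recency and sample size.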

And how did the poll of polls do?

McGirr and Salmond say that ‘Poll-of-polls consistently performed well during the 2008 campaign, outperforming most of the opinion polls and the prediction markets' (p.270). They therefore advocate that both the media and public should pay much more attention to this highly accurate source of political information.

This has prompted me to update the poll of polls widget, which is below.

Salmond ranks the different outlets by their accuracy against the final result. In order they were:

  1. NZ Herald poll of polls 6.1 (error from result)
  2. NZPA poll of polls 6.8
  3. Curiablog poll of polls 8.1
  4. TV3/TNS poll 9.6
  5. 08 wire poll of polls 13.6
  6. iPredict 15.7
  7. TVNZ/Colmar Brunton poll 16.8
  8. NZ Herald/Digipoll poll 19.8
  9. Roy Morgan poll 20.8
  10. Fairfax/Nielsen poll 29.6
  11. NZ Political Stockmarket 109.5
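The chapter does not spell out how the "error from result" scores are computed, but a common metric of this kind is the sum of absolute differences between an outlet's final party-vote estimates and the official result. A hedged sketch, with made-up numbers (not the book's data):

```python
def total_absolute_error(estimate, result):
    """Sum of absolute differences across parties, in percentage points.
    One plausible form of an 'error from result' score; the book may
    use a different formula."""
    return sum(abs(estimate[party] - result[party]) for party in result)

# Hypothetical final estimates vs a hypothetical official result
official = {"National": 44.9, "Labour": 34.0, "Green": 6.7}
outlet   = {"National": 47.0, "Labour": 35.5, "Green": 6.0}

err = total_absolute_error(outlet, official)  # 2.1 + 1.5 + 0.7 = 4.3
```

On a metric like this, a lower score is better, which matches the ordering in the list above.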

The NZ Political Stockmarket used virtual money, so it shows what a difference real money can make.

The authors conclude that media outlets should not just report the individual poll results when they commission a poll, but also publish regular info on a poll of polls and on iPredict.

Incidentally I will probably review and tweak the curiablog methodology a bit when I have some spare time.