Grumpollie on polling cellphones

August 4th, 2014 at 7:43 am by David Farrar

Grumpollie (a senior, experienced NZ pollster) blogs on the issue of calling cellphones. Quoting some data, he notes:

In fact, you can see in the next chart that nearly half (47%) of non-landline households receive more than $40,000 per year, and a quarter (25%) receive more than $70,000 per year. So sure, absolutely, there is a skew toward lower income (and likely younger) households, but don’t assume that cell-only households all contain lower income people and young people.

Andrew breaks down the spread by income being:

  • 38% of households have incomes under $40,000, of which 8% do not have landlines
  • 23% of households have incomes between $40,000 and $70,000, of which 3% do not have landlines
  • 39% of households have incomes over $70,000, of which 4% do not have landlines

He then points out:

Most polling companies will weight their data to (EDIT: try to) correct for things like non-coverage of age and socioeconomic groups. The problem with weighting is it assumes the people you have surveyed (and are weighting) are similar to the people who you haven’t surveyed.

But here’s what you need to think about if you’re wondering how big the cellphone non-coverage issue really is:

  • Do voters in the under $40k income band who are not covered differ in party support from voters in the under $40k income band that are covered?
  • Do voters in the $40-70k income band who are not covered differ in party support from voters in the $40-70k income band that are covered?
  • Do voters in the $70k+ income band who are not covered differ in party support from voters in the $70k+ income band that are covered?

When I say differ, I’m not talking differ ‘just a little’. When you factor in the size of each non-covered income group (8%, 3%, and 4%), they would have to differ massively from each covered income group, when it comes to party support, for them to make much difference at all to a total poll result.
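The arithmetic behind that point can be sketched with assumed support figures (the 30%/40%/50% numbers below are hypothetical; the 15% non-covered share comes from the 8% + 3% + 4% above):

```python
# How far must non-covered voters differ before the headline number moves?
# Non-covered share of households: 8% + 3% + 4% = 15% (from the post).
covered_support = 0.30                    # assumed party support among covered voters
non_covered_share = 0.08 + 0.03 + 0.04

for non_covered_support in (0.30, 0.40, 0.50):
    total = ((1 - non_covered_share) * covered_support
             + non_covered_share * non_covered_support)
    print(f"uncovered at {non_covered_support:.0%} -> "
          f"headline shifts {100 * (total - covered_support):+.1f} points")
```

Even a 10-point gap in support among the non-covered 15% moves the headline figure by only about 1.5 points.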

I think it is inevitable that polling in NZ will start to include mobile phones and/or Internet panels in their sampling. But as Andrew points out, the impact of non-inclusion at this stage is likely to be minor. One company (Roy Morgan) does include cellphones, so it will be interesting to see how accurate they are at the election. In the US, the company that called the highest proportion of cellphones (Gallup) had basically the worst result.

US polling

November 10th, 2012 at 11:18 am by David Farrar

An interesting analysis on Daily Kos of the most accurate individual US pollsters. Fordham’s Center for Electoral Politics and Democracy stated:

“For all the ridicule directed towards pre-election polling, the final poll estimates were not far off from the actual nationwide vote shares for the two candidates,” said Dr. Panagopoulos.

On average, pre-election polls from 28 public polling organizations projected a Democratic advantage of 1.07 percentage points on Election Day, which is only about 0.63 percentage points away from the current estimate of a 1.7-point Obama margin in the national popular vote. […]

And the list of pollsters:

1. PPP (D)
1. Daily Kos/SEIU/PPP
3. YouGov
4. Ipsos/Reuters
5. Purple Strategies
6. YouGov/Economist
11. Angus-Reid
12. ABC/WP
13. Pew Research
13. Hartford Courant/UConn
15. Monmouth/SurveyUSA
15. Politico/GWU/Battleground
15. FOX News
15. Washington Times/JZ Analytics
15. Newsmax/JZ Analytics
15. American Research Group
15. Gravis Marketing
23. Democracy Corps (D)
24. Rasmussen
24. Gallup
26. NPR
27. National Journal
28. AP/GfK

But the commentary is very interesting:

Ha ha, look at Gallup way at the bottom, even below Rasmussen. But let’s focus on the positive—PPP took top honors with a two-way tie for first place. Both their tracking poll and their weekly poll for Daily Kos/SEIU ended up with the same 50-48 margin. The final result? Obama 51.1-48.9—a 2.2-point margin.

PPP is a robo-pollster that doesn’t call cell phones, which was supposedly a cardinal sin—particularly when their numbers weren’t looking so hot for Obama post-first debate. But there’s a reason we’ve worked with them the past year—because their track record is the best in the biz.

The cell phone issue is somewhat overhyped.

One last point—YouGov and Ipsos/Reuters were both internet polls. YouGov has now been pretty good two elections in a row. With cell phones becoming a bigger and bigger issue every year, it seems clear that the internet is the future of polling. I’m glad someone is figuring it out.

Internet panels are a big part of the future – if you do it right. If you do it wrong, they can be self-selecting junk.

But let’s be clear, you have to go down to number six on the list to get to someone who called cell phones. And Gallup called 50 percent cell phones and they were a laughingstock this cycle.

The final Gallup poll had Romney 48% and Obama 47%. Obama got 50.5% and Romney 48%, so Gallup was 3.5 points out on Obama – just outside the margin of error for a sample of 1,000.
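As a check on that last claim, the conventional 95% margin of error for a simple random sample of 1,000 works out at about ±3.1 points:

```python
import math

# 95% margin of error at the worst case p = 0.5 for a simple random sample.
n = 1000
moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
print(f"MoE for n={n}: +/-{100 * moe:.1f} points")

# A 3.5-point miss falls just outside that band.
print(3.5 > 100 * moe)
```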

Cell-phone polling

October 30th, 2012 at 10:00 am by David Farrar

Dim-Post highlights a comment left on his blog by an anonymous pollster:

The problem with calling cell phones doesn’t really lie in the cost of calls. For a polling company, calling a cell phone doesn’t cost that much more than calling a landline. The problem is the complexity and cost of employing dual sampling frames when the proportion of cell phone users without a landline is still very low. If the purpose of calling cell phones is to reduce non-coverage of likely voters, then you may actually need to ‘screen out’ those you call on cell phones who also have a landline (because they are already covered by the landline sample frame).

If we assume 6% of eligible voters have cell phones and no landline, that means that 94% of the people you call on a cell phone will not be eligible to take part (again, because they are already covered by the landline sample frame). This is where the cost would really begin to build up – all those interviewer hours required just to screen people out (eek!).
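The screening cost is easy to put a rough number on. Assuming, as the commenter does, that only 6% of cellphone contacts qualify (the quota of 150 interviews below is hypothetical):

```python
# Answered calls needed to fill a cell-only quota when only 6% qualify.
eligible_rate = 0.06        # cell-only share among cellphone contacts (from the post)
target_interviews = 150     # hypothetical cell-only quota

answered_calls = target_interviews / eligible_rate
screened_out = answered_calls - target_interviews
print(f"~{answered_calls:.0f} answered calls, ~{screened_out:.0f} screened out")
```

Roughly 2,500 answered calls just to complete 150 eligible interviews – that is the interviewer time the commenter is pointing at.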

This is not the only way to reduce non-coverage – but it’s actually one of the more straightforward and ‘statistically pure’ ways (ie, you can develop some sort of weighting scheme, but the more you weight, the greater the design effect, which increases the margin of error and decreases the accuracy of a poll).
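The design-effect point can be quantified with Kish's standard approximation, deff ≈ 1 + CV² of the weights (the 700/300 split and the 2.5× weight below are made up for illustration):

```python
import statistics

# Kish's approximation: uneven weights inflate variance by
# deff ~= 1 + CV^2 of the weights, shrinking the effective
# sample size to n / deff.
weights = [1.0] * 700 + [2.5] * 300   # hypothetical weighting scheme
n = len(weights)
cv_squared = statistics.pvariance(weights) / statistics.fmean(weights) ** 2
deff = 1 + cv_squared
print(f"design effect ~{deff:.2f}, effective sample ~{n / deff:.0f} of {n}")
```

Weighting up 30% of the sample by 2.5× here costs nearly a fifth of the effective sample size – exactly the accuracy loss the commenter describes.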

To make things more complex:

– Some people have more than one cellphone, meaning that the probability of them being called is higher, so additional weighting would need to be applied to adjust for the probability of selection (you may notice that some polls weight by household size and the number of landlines connected to a house – this is adjusting for the probability of selection)

– There are a lot of cell phone numbers that are out of use, but when they are called they still go through to a voice mail. Unlike landlines (which you can ‘ping’ to test the connection), it is very difficult (ie, near impossible) to determine if there is actually an eligible person at the end of a number, so you’ve got no measure of the success rate of your sampling approach (ie, refusal rates, response rates, qualifier rates etc).
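Both bullet points concern design weights. A minimal sketch of the selection-probability adjustment (the respondents and counts are invented):

```python
# A respondent reachable on k numbers is k times as likely to be
# sampled, so their base design weight is proportional to 1 / k.
respondents = [
    {"id": "A", "numbers": 1},
    {"id": "B", "numbers": 2},   # two cellphones -> double the selection chance
    {"id": "C", "numbers": 1},
]
for r in respondents:
    r["weight"] = 1 / r["numbers"]
print([(r["id"], r["weight"]) for r in respondents])
```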

At the moment such a small proportion of New Zealanders have a cell phone with no landline that party support would need to be DRAMATICALLY different among those people for this particular type of non-coverage to influence the poll results for party vote (eg, support for Labour among cell phone only voters may need to be TWICE what it is among landline voters for the party vote result to shift by more than, say, the margin of error).
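The commenter's "twice the support" intuition checks out under assumed figures (6% cell-only, and a party polling 30% among landline voters):

```python
# If support among the 6% cell-only group is double the landline figure,
# the headline result still moves less than a typical ~3.1-point MoE.
cell_only_share = 0.06
landline_support = 0.30                 # assumed
cell_support = 2 * landline_support     # "TWICE" the landline support

total = ((1 - cell_only_share) * landline_support
         + cell_only_share * cell_support)
print(f"headline moves from 30.0% to {100 * total:.1f}%")
```

A shift of about 1.8 points – still inside the margin of error, as the commenter suggests.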

When the proportion of people with cell phones and no landline is considerably larger than it is today (like it is in some other countries), then it will definitely make sense to employ a dual sampling frame approach. In NZ though (at least in 2011) most pollsters got things pretty close to the election day result so this would suggest non-coverage of cell phone only voters isn’t a big issue just yet. If cell phone plans get cheaper, then polling approaches will probably need to change to keep up.

I agree with all of these comments. As time goes on this will become more of an issue in NZ, but at this stage the impact is limited. Also, polling companies compensate for those without landlines by having quotas or weighting by age and income.

The Inaugural Political Polling Forum

May 7th, 2012 at 12:00 pm by David Farrar

On Wednesday evening, an inaugural meeting of the New Zealand Political Polling Forum will be held at Parliament. This is an initiative of the Association of Market Research Organisations (AMRO). I’ve been a bit involved with bringing this together.

I believe that polls have a very significant impact on our democracy and elections. They can act as a performance monitor on the Government and political parties. They help voters to decide if they think a vote will be wasted. They help voters decide whether to vote tactically. They even influence whether people vote at all (as Electoral Commission research has shown). So there is a strong public interest in having public polls conducted to high professional and ethical standards, and also in having them reported in a useful way.

The forum should be of interest to market research professionals, politicians, media, political scientists and others who are politically interested. The forum starts at 6 pm on Wednesday, and there are effectively three parts to it.

  1. A presentation by Assistant Professor Rob Salmond on the 2011 elections and the pollsters, but also on some current issues and global trends in polling such as diminishing landlines, Internet panel polls and the like. Rob also has some really good ideas for the media on how they can report polls in a more meaningful way.
  2. A presentation by Murray Campbell of Baseline Research on the value of having a NZ Polling Code. I’ll comment on this in more detail below.
  3. A panel Q+A with representatives of most of the major NZ political pollsters. Your chance to ask the experts.

If you are interested in attending, then let me know by the end of Monday and I will see if I can get you added to the RSVP list. Technically it has closed for catering purposes, but we should be able to handle a few more people coming if you have an interest in this area.

The idea of some sort of NZ Polling Code is one I support, and it is one I hope can be formulated by both the market research and the media industries. Of course one could have the market researchers just write one unilaterally, but then it would probably be of less practical value. I hope that some of the media who attend will show an interest in helping progress such a code. When I and others refer to a code, that doesn’t mean it necessarily has any regulatory standing, such as with the Press Council or the BSA. I prefer to see it as perhaps a “best practice statement”. The idea is that it will actually make it easier for media to report polls in a useful way.

Now the industry (AMRO and MRSNZ) does not yet have a formal position on having a code, let alone having started to discuss what the substance of any such code might be, and it isn’t the intention to try and do this on Wednesday. This is just the start of a process. But I thought I might share some personal views on issues worth considering at the appropriate time.

  1. Any report of a poll should include the number of responses, the dates surveyed and the level of undecideds as essential elements of the poll.
  2. Reports from market researchers should clearly indicate the exact questions asked, and the order the questions were asked in.
  3. Media organisations should place a copy of the full reports on their websites, after initial publication or broadcast.
  4. The term “poll” should be used only to refer to random non self-selecting exercises, and the term “survey” for other exercises such as text in responses, or web surveys.
  5. An archive of previous poll results should be publicly maintained (maybe by AMRO?)

Now again these are just my views, and just my initial views. Others will have different views. The purpose of the inaugural forum isn’t to get into a big debate on any of the above, but to see if there is agreement that it would be in the public interest to have a code, and who wants to be involved.

Worth noting that while there is no specific code for political polling, members of MRSNZ and AMRO follow the international code of ethics developed by ESOMAR (the European Society for Opinion and Marketing Research).

We have a good range of people coming along on Wednesday, including MPs, press gallery, pollsters, Stats NZ, academics etc. It is in the State Banquet Hall, so if you are in Parliament you can just drop in. There’s even some food and drink before it starts. I’ll blog afterwards how it goes. As I said, I think it is a very worthwhile initiative.