How many pollsters does it take to screw in a light bulb? Part One

Posted on August 15, 2011

No, that's not Avril. It's Jana Parizkova, Member of Czech Parliament, and Pollster for the Public Affairs Party. And no, I didn't even TRY to find something provocative. It just popped up in a search on "pollsters." *

While polls and surveys may have a place, that place is most often at the bottom of birdcages.

No, sorry. I don’t really mean that — or at least, not completely. When conducted right, and analysed with care, they often have the ability to confirm something everyone already knew or, on occasion, reveal something we’d only suspected.

I do, however, have strong doubts concerning their legitimacy as a science.

Especially amusing is the exactness of their claims to accuracy (“This poll is accurate within 3.1 percentage points, 19 out of 20 times”). How do they know? Well, aside from pre-election polls (the accuracy of which is ultimately determined, and often confounded, by the election itself), most polls are pretty much confirmed by other polls — many of which come up with completely different results.
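To be fair, the arithmetic behind that oddly precise claim is standard textbook stuff. A minimal sketch (assuming, purely for illustration, a simple random sample of about 1,000 respondents and a 95% confidence level — the usual meaning of “19 out of 20 times”):

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case margin of error for a polled proportion,
    assuming a simple random sample of size n.

    Uses p = 0.5, which maximises p*(1-p) and so gives the
    largest (most cautious) margin; z = 1.96 corresponds to
    95% confidence, i.e. "19 times out of 20"."""
    return z * math.sqrt(0.25 / n)

# A typical national poll of roughly 1,000 respondents:
print(round(margin_of_error(1000) * 100, 1))  # → 3.1 (percentage points)
```

Note what the formula does and doesn’t promise: it quantifies sampling error only, under the assumption that the sample really was random. It says nothing about bad wording, non-response, or respondents fibbing — which is rather the point of this post.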

So, the short answer to, “How do they know how accurate polls are?” is, “They don’t, really.”

The long answer is, “They don’t have a clue.”

Regardless of this, or more likely, because of this, pollsters spend a lot of time and effort trying to refine this ethereal standard of accuracy by tweaking and re-evaluating their methods.

Aside from the importance of ensuring that the sampling is properly random (which I’m not even going to touch on here), one of the major problems is wording.

Lord knows, you've got to be careful how you say things. (Photo from dumb.com)

Bad wording can result in inaccurate results. The manager at one of Canada’s largest polling firms once told me that to determine the correct wording, they conduct simultaneous surveys in which the wording is slightly different. The example he cited concerned public receptivity to raising welfare rates. In one version, respondents were asked, “Are you in favour of raising social assistance rates for people who cannot find work?” In the other version they were asked, “Are you in favour of raising social welfare rates for people who cannot find work?”

Those asked about “social assistance” answered overwhelmingly in the affirmative, those asked about “social welfare” answered overwhelmingly in the negative.

I had to agree this definitely showed the importance of words, but I was less than reassured that it had anything to do with accuracy. Which phrase, I wanted to know, was more accurate? For this particular liberal polling firm, the answer was “social assistance.” A conservative polling firm may well have come to a different conclusion.

But even identically worded polls, conducted by the same company only a few days or a week apart, often return significantly different results. Far from being cause for alarm, however, these occasions offer analysts the opportunity to gleefully analyse the “radical shift in public opinion.” Sometimes an event has occurred that may well have played a part in swinging public opinion one way or the other (sex scandals, leaked documents, money laundering), but more often than not there’s little obvious correlation between the “shift” and current events.
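A quick simulation shows how easily such a “radical shift” can arise from sampling noise alone. Here two polls of 1,000 respondents each are drawn from the same unchanged population (the 50% support figure and the sample size are illustrative assumptions, not data from any real poll):

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

TRUE_SUPPORT = 0.50   # hypothetical: public opinion has not moved at all
N = 1000              # respondents per poll

def run_poll():
    """Simulate one poll: each respondent independently supports
    the issue with probability TRUE_SUPPORT."""
    return sum(random.random() < TRUE_SUPPORT for _ in range(N)) / N

poll_a, poll_b = run_poll(), run_poll()
print(f"Poll A: {poll_a:.1%}   Poll B: {poll_b:.1%}   "
      f"apparent shift: {poll_b - poll_a:+.1%}")
```

Run it a few times with different seeds and you will routinely see “shifts” of a point or two — occasionally more — between two polls of a population whose opinion, by construction, has not changed at all.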

Of course, an apparent “shift of public opinion” may not be real to start with, but it can quickly become real if it combines with another known characteristic of polls: their ability to actually change public opinion. A natural variation between two polls that looks like an increase in support for an issue or candidate can, through media feedback, end up becoming a real increase in support for that issue or candidate.

Pre-election polls are often hailed as being surprisingly accurate in predicting election results. It’s important to realise, however, that the behaviour of respondents when answering questions on how they will vote is basically the same behaviour they will exhibit when they vote.

Things get more complicated, though, when we start asking people about behaviour that is radically different from that of ticking off a list of choices. It’s one thing to say you’re going to do something; it’s another thing to actually do it. When investigating the feasibility of a new bus route (or increasing service along an existing one), a transit company may conduct a poll asking if people would be “very likely, somewhat likely, somewhat unlikely, or very unlikely” to use it. Respondents strongly supporting public transit may be inclined to select “somewhat likely” or “very likely,” even though they have no intention of leaving their Hybrids at home. In a similar fashion, respondents who may only be “somewhat likely” to use it may select “very likely” in the belief that they are increasing the odds of the new route being put in place.

The end result is empty buses and wasted public funds.

This desire to use polls for our own ends can also affect pre-election polls. A number of Canadian socialists may indicate they’re going to vote for the NDP, but when the time comes to vote, pragmatism takes over and they vote for a Liberal instead. It’s a long-standing joke that if an election were held between elections, the NDP would win handily. (I said it was a “long-standing” joke — not a funny one.) This can go beyond pre-election polls to the elections themselves. Nobody really believes the people of Quebec suddenly turned into NDPers a few months ago, yet they recently elected a swath of NDP MPs as a way of giving the finger to a party they were annoyed with.

(Next post: looking good for the pollster.)

———————————————–

* To celebrate electing more women to cabinet than ever before (44), the Public Affairs party decided to issue a calendar featuring some of its new MPs (being referred to as “Czechmates” in some newspapers). All 12 months can be seen on the UK Mirror site here.
