The politics of polling contradicts common sense
Date Released: Wed, 18 September 2013 08:59 +0200
You have opinions and prejudices that you want everyone to take seriously? Put a number on them. That way, you too can get loads of publicity while you mislead people on politics and labour.
One sign that an election is approaching is that people and firms are lining up to publish "reports" telling us how people will vote. A sign that we are in the midst of bargaining season is the usual reports telling us how much strikes cost the economy. In both cases, the numbers don’t tell us anything worth knowing. They disguise the fact that someone is trying to get us to believe something or to believe in them.
Using numbers to push a line is not new. A decade ago, when the Department of Home Affairs wanted us to believe we were about to be swamped by foreigners, we would be fed claims about how many "illegal immigrants" were within our borders. No one who repeated these numbers bothered to ask how it was possible to count people who were not supposed to be here and so devoted much energy to not being counted. Predictably, it turned out that the numbers were based on a formula that made no sense at all.
At present, numbers are used routinely to tell us that strikes are a huge problem: media reports quote economists and employer associations announcing solemnly how many billions of rand stoppages cost the economy. Working out how much a strike cost the economy is complicated — it is not even clear that it is possible.
If you simply calculate lost production, are you allowing for the fact that firms often make up time after a strike to recover lost production? And what is a "loss to the economy"? Do the figures take into account that the employer does not pay wages during a strike? And, since this is a gain for the employer but a loss for the worker, and both are part of the same economy, how do we decide its net effect?
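The questions above can be made concrete with a toy calculation. The figures below are invented purely for illustration; they show only that the same strike yields very different "costs" depending on which accounting choices you make.

```python
# Hypothetical numbers, purely illustrative: a one-week strike at a single firm.
lost_output = 10_000_000      # rand of production not made during the strike
recovered_output = 6_000_000  # rand later made up through extra shifts
unpaid_wages = 2_000_000      # rand the employer keeps; the same rand workers lose

# Three defensible "costs to the economy", giving different answers:
gross_loss = lost_output                   # the usual headline figure
net_loss = lost_output - recovered_output  # after catch-up production

# Unpaid wages are a transfer inside the economy: the employer's gain
# exactly offsets the workers' loss, so they arguably cancel out.
transfer_adjusted = net_loss

print(gross_loss, net_loss, transfer_adjusted)
```

Depending on the choices made, the "cost" here is R10m or R4m, and that is before harder questions, such as where the workers' lost wages would have been spent, are even raised.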
These are only some of the problems raised by the calculations. But no one who makes these claims is ever asked how they arrived at them — they are reported as "fact", even though it seems likely that they are meant to scare us, not to inform us.
The election predictions aren’t designed to persuade us of anything, although they do often show strong evidence of wishful thinking. They aim to promote a company or person by showing us how clever they are.
Recently, investment banks have taken to releasing political predictions — one offered a detailed forecast of how each party would fare. Its report was published widely, but no one seems to have bothered to ask how it arrived at its figures. Did it conduct a survey? If so, when did banks acquire social-surveying skills? Does it have a team of researchers monitoring the parties? If so, do its shareholders know it is spending money on social research? If there was no research, is it not probable that its "calculations" are based on guesswork and gossip, to which a number has been attached?
Some election forecasts are based on surveys — market researchers flood us with polls telling us which way the election is likely to go. They, too, are treated as "fact" — and are often taken as a given in "informed" political discussions.
But an exercise conducted a few years ago shows that these polls usually get elections wrong — and not just by a few percentage points. They often fail to forecast the key trends. This is not surprising — working out how people will vote requires a skill that market researchers usually lack.
A recent report by Africa Check, a fact-checking service that tests claims people make in public debate, investigated a prominent polling company that claims to have developed a way of testing youth opinion by polling on social media, and which trashes its competitors' "outdated" methods.
According to Africa Check, the firm does not use random sampling, the standard method of opinion research.
People must volunteer to answer their questions by signing on. No attempt is made to ensure that those who sign on are who they say they are — an Africa Check researcher who is white, in his late 30s and based in Gauteng signed on as a black man in his early 20s from the Northern Cape. He was not challenged.
This contradicts just about every common-sense law of surveying.
Participants select themselves and there seems to be no control on people who misrepresent themselves. These flaws render the survey about as valuable as a poll on attitudes to meat-eating conducted at the vegetarian society. And yet the company’s surveys are routinely quoted as though they provide hard information.
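A short simulation, using invented figures, shows why self-selection matters. Suppose 30% of a population supports a given party, but its supporters are three times as likely to volunteer for an opt-in poll. A random sample recovers the truth; the opt-in "sample" does not.

```python
import random

random.seed(0)

# A hypothetical population of 100,000 voters: 30% support party A (coded 1).
population = [1] * 30_000 + [0] * 70_000

# Random sample: every member is equally likely to be chosen.
random_sample = random.sample(population, 1_000)

# Opt-in "sample": assume supporters of A are three times as likely to sign up.
weights = [3 if v else 1 for v in population]
opt_in_sample = random.choices(population, weights=weights, k=1_000)

print(sum(random_sample) / 1_000)   # lands close to the true 30%
print(sum(opt_in_sample) / 1_000)   # badly inflated by self-selection
```

The opt-in poll here overstates the party's support by more than 20 percentage points, and no amount of extra volunteers fixes it: a bigger biased sample is just a more precise wrong answer.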
People who use numbers do not necessarily know more than those who do not — and they should be made to explain how they arrived at their figures if they want to be taken seriously.
If we want to know more about what is happening around us, we need to stop taking those who use numbers at face value.
Friedman is director of the Centre for the Study of Democracy.
BY STEVEN FRIEDMAN
Article Source: Business Day