With an election on the way, polls are likely to come thick and fast. But with people choosing to disbelieve what they don’t like, when can you believe what polls say?
What Polls Really Are
There are a number of good polling companies that issue polls on party popularity. These are usually commissioned by newspapers or other organisations because that’s good copy – especially while we are weighing up who to vote for.
Detailed polling is often unseen by the public – commissioned by political parties and campaigns to help inform their strategies. Some is published in headline form if the party or campaign thinks doing so is useful to them, but broad public opinion polling tells us more.
“Most polls were pretty close to the mark in the 2019 European Elections and all of them showed Labour closing the gap on the Conservatives during the 2017 Election.”
The difficulty with this is that people often choose what to believe on the basis of what they like. A poll running against “my” party can be all too easily dismissed as biased or inaccurate. This can occasionally be fair criticism, but it is often not.
So Polls Are Accurate?
Polls are accurate to an extent. Pollsters (good ones anyway, and we’ll get to bad ones shortly) will stress that a one-off poll might be an outlier, because statistics throw those up occasionally. So it is better to look at trends rather than imagine one poll equates to an outcome.
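The outlier point can be illustrated with a minimal simulation: even when a party's "true" support never moves, individual polls of a typical sample size bounce around it just through random sampling. The figures below (35% true support, 1,000 respondents) are illustrative assumptions, not real polling data.

```python
# Sketch of sampling variability: repeated polls of the same fixed
# "true" support level still produce scattered individual results.
import random

random.seed(1)

TRUE_SUPPORT = 0.35   # assumed true vote share (illustrative)
SAMPLE_SIZE = 1000    # a typical poll sample size

def run_poll():
    """Simulate asking SAMPLE_SIZE random voters."""
    votes = sum(1 for _ in range(SAMPLE_SIZE) if random.random() < TRUE_SUPPORT)
    return votes / SAMPLE_SIZE

# Twenty polls in a row: each one differs, and the extremes can sit
# a couple of points from the truth, despite nothing having changed.
results = [run_poll() for _ in range(20)]
print(min(results), max(results))
```

The spread of those twenty results is why one surprising poll is weak evidence on its own, while a trend across many polls is strong.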
Indeed, even after some shocking performances by some polling companies in recent elections, most were pretty close to the mark in the 2019 European Elections and all of them showed quite accurately a dramatic closing of the gap between Labour and the Conservatives during the 2017 General Election.
“Because YouGov often shows bigger Tory leads than others, Labour supporters might naturally assume some bias. However, that does not mean the polls are biased.”
Exact figures reflect methodology. The methods used by polling companies are broadly sound. They use proven good practice based on past outcomes to weight polling and frame questions. They also run repeating polls so the same question is tested again and again to show trends. No methodology is perfect, but consistency provides value.
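To make the weighting idea concrete, here is a minimal sketch of demographic weighting, one common piece of the good practice described above. The age groups, population shares, and party responses are invented for illustration: if one group is over-represented in the sample, each respondent in it counts for less, so the weighted sample matches known population shares.

```python
# Sketch of demographic weighting with invented figures:
# weight each group by (population share / sample share).

population_share = {"under_50": 0.55, "over_50": 0.45}  # assumed census shares
sample = [
    ("under_50", "PartyA"), ("under_50", "PartyB"),
    ("over_50", "PartyA"), ("over_50", "PartyA"),
    ("over_50", "PartyA"), ("over_50", "PartyB"),
]

# Raw sample shares: under_50 = 2/6, over_50 = 4/6 (over_50 over-sampled).
sample_share = {g: sum(1 for grp, _ in sample if grp == g) / len(sample)
                for g in population_share}

weights = {g: population_share[g] / sample_share[g] for g in population_share}

def weighted_support(party):
    """Share supporting a party once respondents are reweighted."""
    total = sum(weights[g] for g, _ in sample)
    favour = sum(weights[g] for g, p in sample if p == party)
    return favour / total

print(weighted_support("PartyA"))
```

The raw sample puts PartyA on 4 of 6 responses (about 67%); weighting the over-sampled older group down pulls that figure back towards 61%. Different pollsters make different weighting choices, which is one reason their exact figures differ while their trends agree.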
So while any one poll might be a bit wide of the mark, the core indications from polls are broadly reliable.
But What About Bias?
The most natural response to a poll showing something we don’t like is to assume some sort of inbuilt bias rather than accidental inaccuracy.
This can best be seen with YouGov – which accurately (and unusually) projected a hung Parliament two weeks before the 2017 General Election, but which has for some time now shown larger Conservative leads over Labour than other polls.
Because YouGov often shows bigger Tory leads, Labour supporters might naturally assume some bias. The past preference of the Tory Party itself for YouGov can add credibility to those assumptions.
“Kantar TNS finds wildly different figures to other pollsters. This does result from identifiable bad practice, not an assumption of it.”
However, that does not mean the polls are biased. The methodology might be wrong or it might be the only one that is right. What matters is that the methodology does not change poll by poll, so the outcome still shows useful information.
In effect, there is no one sitting there trying to add extra voters to one party’s column in place of another on a poll-by-poll basis.
Some Polls Are Still Bad Though?
While polling may vary a bit, the figures published by the good pollsters are pretty similar, and trend rises and falls in one poll are often seen across other polls too.
But there are bad pollsters and they need to be treated differently.
Kantar TNS, for example, finds wildly different figures to other pollsters for a questionable reason. This week it put the Conservatives on a 14 point lead over Labour – very out of line with other pollsters.
“Whether the Brexit Party’s true new position is 11% or 16% is less important than the reliable information that it has stabilised.”
This results from identifiable bad practice, not an assumption of it. Bizarrely, Kantar only prompts for the Conservatives, Labour, and the Lib Dems on its first page. This dramatically warps its results and I can think of no reason a good pollster would do this.
By suppressing Brexit Party and Green “votes” (among others) it must know it is getting bad results – not least because voting slips at the ballot box literally put all parties on one page.
Maybe they do this in order to get extreme results and thus boost publicity. But it certainly doesn’t feel like accuracy is the motive.
So Who (or What) Can We Trust?
Good information can help inform the public. Good pollsters provide it in the form of trends.
When Johnson became Prime Minister, the Conservatives saw their polling figures boosted in all polls. Whether they went up to 32% or 38% is relatively insignificant compared to the valuable information that they went up.
Several polls lately have seen the reverse happening, which is crucial to understanding Tory tactics. While the Brexit Party slumped during the Tory bounce, it has stabilised in all polls now. Whether its true new position is 11% or 16% is less important than the reliable information that it has stabilised, because that stability will be informing its strategy.
Likewise Labour are up in several polls too now – closing the gap on the Tories again. Whether Labour have closed the gap by 2% or 6% is less important than the useful information that the gap has started closing as an election looms. Again, it is that trend that helps us understand Labour strategy, not the absolute figure.
So whether polls say what you like or dislike, try not to ignore them. Instead, try to focus on the trend movements – because those are what the polls are really telling you.
*A shout-out to Britain Elects, which helps share polls impartially to keep people informed.