Now the dust is settling after an eventful night. And, since you have patiently put up with our musings throughout the campaign, it is only right that we now set out for you what happened.
Before we get into the meat of the argument, we need to establish one thing. Voting intention polls, the kind ComRes has carried out for ITV News and the Daily Mail during the campaign, are very different from the Exit Poll. There was only one Exit Poll (which was not conducted by ComRes), and all of its interviewing took place on polling day, with interviewers standing outside polling stations asking people how they had voted as they left. Exit Polls are designed solely to predict the number of seats each party has won, by asking how people have voted and then calculating probabilities for the winners.
In contrast, national voting intention polls don’t predict seats; rather, they are designed to give a snapshot of how people would vote in a general election across the whole country. ComRes did not do any seat projections, and our final poll, published on the evening before election day, found the following vote shares: Conservatives 35%, Labour 34%, Lib Dems 9%, UKIP 12%, Greens 4%.
Now, there has been a degree of criticism of polling firms over the course of the night, and there will be plenty more column inches devoted to the polls in the days to come. We have always been open and honest about the challenges facing us as pollsters at this and every other election. We continually review how to fine-tune our methods to ensure we can achieve the granularity we desire and you expect, adapting where necessary to what is happening out in the country. For example, in October 2014 we took the decision to list UKIP as a separate choice in our voting intention polls, at a time when it was not fashionable to do so. We are humble enough to say we will review our methods and make any adjustments necessary.
However, we also need to recognise misunderstandings of what polls are intended to deliver, and to address misconceptions about what pollsters do.
It is now clear that the Conservatives outperformed all expectations. Much of the media narrative before and during the campaign was about the contest being ‘too close to call’ (to quote the Guardian front page headline on 7th May).
However, ComRes went against the grain early in the campaign: on 14th April we published a Pollwatch bulletin with the headline “Look past the poll of polls, the Conservatives have been leading all year”. We made the point that in none of our telephone polls throughout 2015 had Labour been in the lead. Indeed, looking now at the entirety of our 2015 telephone poll series (for the Daily Mail and ITV News during the campaign period), the early trend we spotted held true: the Conservatives were never behind Labour in a ComRes telephone poll in 2015. For all those warning of a “late swing” to the Conservatives, we believed the crossover in the polls happened at the start of the year. Though that does not rule out further late swing, of course.
This election, with its particular foibles, highlights the challenge for all pollsters. “Known unknowns” became a watchword throughout GE2015. The landscape in Scotland changed drastically after the September referendum, and early signs made clear that the SNP was gaining support. The challenge was always going to be to work out, first, just how solid that SNP surge would be, and secondly, how to accurately capture what was happening in Scotland, which accounts for around 9% of the population but 59 seats in the House of Commons.
The impact of the Liberal Democrats entering Coalition and effectively giving up their “third party status”, as well as the rise of UKIP, provided us pollsters with extra headaches. With this fracturing of the country it was always going to be important to understand the shape of the race in the key battlegrounds.
Our hunch that we were seeing a patchwork of regional contests was vindicated by a series of “battleground” polls for ITV News, focusing specifically on four key elections-within-the-election: the 40 Scottish seats held by Labour, the 14 seats held by the Liberal Democrats in their South West heartlands, UKIP’s target seats in the South East, and the 50 Conservative-Labour marginals.
These battleground polls were crucial in understanding what was happening around the country. The South West poll suggested that all 14 Lib Dem seats where the Tories were second could be lost, only for Nick Clegg to describe it as “baloney”. Nigel Farage followed suit, describing as “terrible” our poll suggesting UKIP were struggling to win some of their target seats. Both of those polls now look to have uncovered the true story lying beneath some of the national polls.
We will go back and work out what happened and make any necessary adjustments, just as we do after every election.
But, before we get too carried away, there are many things the polls got right: the collapse of the Liberal Democrats, UKIP’s burgeoning vote share and the SNP’s dominance in Scotland.
So what happens when we compare our final poll with what actually happened?
| | Eve of election poll | Actual share (as of 10.00) |
| --- | --- | --- |
| Conservative | 35% | |
| Labour | 34% | |
| Lib Dem | 9% | |
| UKIP | 12% | |
| Green | 4% | |
As with every poll of around 1,000 respondents that we conduct, these vote shares are subject to a margin of error of plus or minus three percentage points. So for each party’s vote share we were statistically on the button: within the margin of error at the 95% confidence level.
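That “plus or minus three points” figure can be sanity-checked with the textbook formula for a simple random sample proportion. A minimal sketch in Python (the function name is ours, and we assume a straightforward 1,000-respondent random sample; real polls use weighting and quota sampling, which typically widen the effective margin somewhat):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p with n respondents.

    Illustrative only: assumes a simple random sample, which understates
    the uncertainty of a weighted quota poll.
    """
    return z * math.sqrt(p * (1 - p) / n)

# The final poll put the Conservatives on 35% with roughly 1,000 respondents.
moe = margin_of_error(0.35, 1000)
print(f"±{moe * 100:.1f} points")  # roughly the ±3 points quoted above
```

The margin is widest at a 50% share and shrinks for smaller shares, which is why the same ±3-point rule of thumb covers every party in a 1,000-sample poll.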
Some commentators have been very quick to put the boot into pollsters for calling it wrong. The truth is that pollsters, when they stick to their knitting, measure vote share. We do indeed, together with academics and the media, need to look at how that vote share translates into House of Commons seats – that is certainly true. But there is no need to throw the baby out with the bathwater. Most of the polls from most of the pollsters were within the margin of error. How they are interpreted and reported needs to be a matter of collective consideration.