In the end, the referendum result on Scottish independence was not as close as the polls made it out to be. Granted, it was still a relatively near-run thing. But the 55% to 45% result was not the 52% to 48% split the polls had suggested. Why?
Put away your pitchforks for a moment, and let's look at what might have occurred. Undoubtedly, the pollsters who were in the field in Scotland will be doing some post-mortems of their own. It will be interesting to see what they come up with in the coming days.
But first, let's look at what the polls said would happen.
If we do a simple average of all the polls that were in the field to September 16 or 17, we see that, after the removal of undecideds, the Yes side was expected to take 47.8% of the vote, with the No side at 52.2%. Against the official result of 44.7% to 55.3%, the Yes side was thus over-estimated by 3.1 points, and the No side under-estimated by the same amount.
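For those who like to check the arithmetic, here is a quick sketch in Python using only the figures quoted above and the official count:

```python
# Back-of-the-envelope check of the poll-average miss.
poll_yes, poll_no = 47.8, 52.2      # final-poll average, undecideds removed
actual_yes, actual_no = 44.7, 55.3  # official referendum result

print(f"Yes over-estimated by {poll_yes - actual_yes:.1f} points")  # ~3.1
print(f"No under-estimated by {actual_no - poll_no:.1f} points")    # ~3.1
```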
By that measure, it was a bit of a miss. All but one of these polls had the Yes side at either 47% or 48% after the removal of undecideds. Missing by two or three points is not a horrible result, but all of the polls missed in the same direction. This suggests it was not simply a question of bad polling - if the error were merely random, some polls should have put the Yes side below 45% as well. Instead, there appears to have been a systematic issue with what people were telling the pollsters.
The average support for the Yes side before the removal of the undecideds was 44.1%, or 0.6 points lower than the result. In fact, virtually all of the polls had the raw Yes support at or lower than 45%. Could it be that the undecideds, who averaged 7.4%, swung to the No side?
If that was the case, it means that roughly 90% of undecideds voted No, with just 10% or so voting Yes. That is a rather big number. In the 1995 Quebec referendum, support for the Yes side averaged 47% in the final polls, with support for the No side at 42%. Portioning out the 11% of undecideds suggests that about 20% of them would have voted Yes, and 80% of them No. So perhaps such a lopsided split among Scotland's undecideds is not outlandish.
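The same portioning, applied to Scotland's numbers, is a minimal sketch that assumes the raw poll figures were accurate and that every undecided respondent actually turned out to vote:

```python
# Implied split of the undecideds, assuming the raw poll numbers were right
# and every undecided respondent cast a ballot.
raw_yes, undecided = 44.1, 7.4   # averages of the final polls
actual_yes = 44.7                # official result

undecideds_to_yes = (actual_yes - raw_yes) / undecided
print(f"Undecideds breaking Yes: {undecideds_to_yes:.0%}")      # ~8%
print(f"Undecideds breaking No:  {1 - undecideds_to_yes:.0%}")  # ~92%
```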
But the undecideds may have simply not voted. And a poll of voters conducted by Lord Ashcroft after they had cast their ballots suggests that the last-minute deciders swung to the Yes side, not the No side. His poll found that 15% of Yes supporters made up their minds in the last few days of the campaign or on voting day itself, compared to 6% of No voters. Applied to the final vote shares, that means roughly two in three of the voters who decided at the last moment swung to the Yes side.
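A rough sketch of where that two-in-three figure comes from, applying Ashcroft's percentages to the final vote shares:

```python
# Share of last-minute deciders who broke for Yes, per Ashcroft's figures.
yes_vote, no_vote = 44.7, 55.3   # final vote shares
late_yes = 0.15 * yes_vote       # 15% of Yes voters decided late (~6.7 points)
late_no = 0.06 * no_vote         # 6% of No voters decided late (~3.3 points)

print(f"Late deciders breaking Yes: {late_yes / (late_yes + late_no):.0%}")  # ~67%
```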
Assuming that is accurate, it undermines the theory that a disproportionate swing of the undecideds to the No side is what threw the polls off.
Another finding from the Ashcroft poll, however, points to something else.
The poll found that 14% of No voters would be 'reluctant in any way to tell your friends, family, or colleagues how you voted', compared to 11% of Yes voters. That is not a huge difference, but it could explain some of the error, if we assume that people who would be reluctant to tell their friends how they voted would also have been reluctant to tell pollsters how they intended to vote.
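To get a sense of scale, here is a crude, purely illustrative sketch. It assumes, for the sake of argument only, that every voter who was reluctant to tell friends and family was equally unwilling to state a preference to a pollster, and that such voters simply went uncounted:

```python
# Crude illustration: suppose every 'reluctant' voter declined to state a
# preference to pollsters, and so dropped out of the decided sample entirely.
actual_yes, actual_no = 44.7, 55.3
reluctant_yes, reluctant_no = 0.11, 0.14   # Ashcroft's reluctance figures

stated_yes = actual_yes * (1 - reluctant_yes)
stated_no = actual_no * (1 - reluctant_no)
print(f"Yes among those stating a preference: {stated_yes / (stated_yes + stated_no):.1%}")
# ~45.5%, i.e. roughly 0.8 points of the over-estimate of the Yes side
```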
Turnout might be another contributor to the miss.
In the four councils that voted for independence, turnout was 79.1% (Glasgow, which had the lowest turnout in the country, was the main culprit). In the remaining 28 councils, all of which voted against independence, turnout was 86%. If we use that gap as a rough measure of how much more likely unionists were to vote than secessionists, the polls would instead have had the Yes side at 45.7% and the No side at 54.3% - a much better prognostication than their unadjusted numbers.
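The adjustment works like this - a rough sketch that treats the council-level turnout gap as if it applied directly to each side's supporters:

```python
# Re-weighting the poll average by the turnout gap between Yes-voting and
# No-voting councils, as a crude proxy for differential turnout by side.
poll_yes, poll_no = 47.8, 52.2
turnout_yes_side, turnout_no_side = 79.1, 86.0

weighted_yes = poll_yes * turnout_yes_side
weighted_no = poll_no * turnout_no_side
adjusted_yes = weighted_yes / (weighted_yes + weighted_no)

print(f"Turnout-adjusted Yes: {adjusted_yes:.1%}")      # ~45.7%
print(f"Turnout-adjusted No:  {1 - adjusted_yes:.1%}")  # ~54.3%
```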
Put these two factors together and virtually all of the error is accounted for: a combination of Yes supporters being less likely to turn out and No supporters being less likely to reveal their voting intentions.
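Tallying those two rough effects against the 3.1-point over-estimate of the Yes side - using the figures from the sketches above, not a formal decomposition - gets us most of the way there:

```python
# Tallying the two effects against the 3.1-point over-estimate of Yes.
total_miss = 47.8 - 44.7            # poll average minus actual result
turnout_effect = 47.8 - 45.7        # from the turnout re-weighting
reluctance_effect = 45.5 - 44.7     # from the 'reluctant voter' illustration

print(f"Miss to explain:       {total_miss:.1f} points")                          # 3.1
print(f"Turnout plus shyness:  {turnout_effect + reluctance_effect:.1f} points")  # ~2.9
```

So the question then is: was it ever going to be close?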