Data is a big deal in PR and marketing: companies with genuine, proven insights into their industry stand head and shoulders above their competitors.
Indeed, a significant volume of the work we do for our clients depends on data and research – commercially valuable information designed to resonate with specific, often very niche business audiences. As such, we keep our ears to the ground when it comes to methodology.
Case in point: right up until the day of the election, opinion polls were painting David Cameron and Ed Miliband as neck and neck in the race to Number 10.
According to the New Statesman, the polls unanimously predicted a hung parliament, and though there were variations in the predicted distribution of seats, not a single poll anticipated that Cameron would win enough seats to continue as Prime Minister on the 8th.
Then, at 10pm on election night, the joint BBC, Sky and ITV News exit poll was published, unexpectedly putting the Conservatives 77 seats ahead of Labour.
The original opinion polls were wrong – starkly so. The British Polling Council has since announced an inquiry, stating: “The opinion polls before the election were clearly not as accurate as we would like, and the fact that all the pollsters underestimated the Conservative lead over Labour suggests that the methods that were used should be subject to careful, independent investigation.”
Some have suggested that the disparity was down to undecided and tactical voters whose last-minute decisions skewed the results. Others have pinned it on so-called “shy Tories” who refused to tell pollsters how they were voting. And then there are those who point to a fundamental flaw in this kind of research: it cannot replicate the physical experience of voting, or the emotional and rational responses to that experience. In this context, intention and action must be separated. Though I might fully intend to vote for Party A while sitting in my living room on Monday, standing in the polling station on Thursday I may find myself voting for Party Z.
Earlier this year, New Scientist magazine published an interview with the Bank of England’s research chief, Andy Haldane, explaining the BoE’s new approach to economic research. He said: “Some of our models and ways of thinking didn’t bear the scrutiny they came under. In light of the crisis, what better time to do some more cross-fertilisation between disciplines? I have tried to fuse bits of economics with physics, epidemiology, psychology, anthropology and one or two other ologies, to try to make sense of the world.”
In our business, we’ve long seen the benefit of triangulating research methods to get the most rigorous result. We use economic modelling, market research, competitor analysis, data mash-ups and in-depth interviews as well as opinion polls to get to those vital nuggets of commercially interesting information.
Opinion polls are great at highlighting trends and reflecting back to people what their peers are thinking. But it’s crucial to remember that many variables affect opinion research: how the question is phrased, in what format it is asked (face-to-face, over the phone, online), and under what circumstances.
So, don’t place any bets on what may turn out to be skewed results. You may just end up eating your hat. Or kilt.