Stuck in the middle
The results of tonight’s Electoral College vote have stunned almost all observers outside the USA, many of whom expected the Democratic nominee to sweep into the White House. The big question is this: why was the data so different from the outcome? Almost every poll was calling a near-certain Democratic win, and yet today’s reality is vastly different. There are lessons here for anyone who works with data and insights:
- Don’t accept what the data is saying at face value; question its veracity
- Investigate the data trail: understand how the data was produced and what collection and analysis methods were used, and look for assumptions or gaps in the process
- Validate the data with other sources or forms of insight-gathering
Let’s expand a bit more on those, shall we?
Investigating the Data Collection
Let’s look at the data collection. According to Pew Research, election polling is simply a matter of “just ask them who they are going to vote for on Election Day.” However, the answers, whilst straightforward, may not reflect the final outcome, because the context in which the questionnaire is structured can influence the answer given at a particular point in time. In addition, not all responses can be taken as final, given that people can change their minds.
The polling process is similar to the process marketing focus groups use to extract insights from interviewees. And that’s where I see the gap. The challenge with focus groups and moderated questionnaires is that people may give different answers in order to appear politically correct. For example, Pew Research notes that “a pattern of polling errors during the 1980s and 1990s in elections involving African-American candidates raised the question of whether some people are reluctant to say that they are voting against a black candidate.” Taken together, this can introduce a bias into the results of any focus group or poll.
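To get a feel for how little misreporting it takes to mislead a poll, here is a minimal simulation sketch. The numbers are hypothetical for illustration only (they are not real polling data): candidate A truly has 48% support and would lose, but a small fraction of B’s supporters tell the pollster they back A.

```python
import random

random.seed(42)

def run_poll(true_support_a, misreport_rate, sample_size):
    """Simulate a two-candidate poll where a fraction of candidate B's
    supporters misreport their preference as A (social-desirability bias)."""
    reported_a = 0
    for _ in range(sample_size):
        prefers_a = random.random() < true_support_a
        if prefers_a:
            reported_a += 1
        elif random.random() < misreport_rate:
            reported_a += 1  # a B supporter answers "A" to seem acceptable
    return reported_a / sample_size

# Hypothetical scenario: A's true support is 48% (A actually loses),
# but 5% of B's supporters claim to back A when asked.
honest = run_poll(true_support_a=0.48, misreport_rate=0.0, sample_size=100_000)
biased = run_poll(true_support_a=0.48, misreport_rate=0.05, sample_size=100_000)
# "honest" lands near 48%, while "biased" drifts above 50%,
# making the losing candidate look like the likely winner.
```

A 5% misreport rate moves the estimate by roughly 2.5 points, comfortably more than the margin of error a large poll would quote, which is exactly why a bias like this cannot be averaged away by taking more samples.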
Ethnographic Interviewing and Jobs-to-be-Done
That’s why I’m a big advocate of investigating customer needs using ethnographic interviewing techniques such as Jobs-to-be-Done (JTBD). JTBD interviews differ from focus groups because they dive into the emotions that shape a decision. For example, instead of asking “Why did you buy product or service X?”, the interviewer tries to understand the history behind the purchase and paint a fuller picture of the problem. In that way, we can unearth hidden drivers behind a decision, drivers which may directly contradict the straight-up answer.
Data and Insights, in Context
So what can we learn from today’s episode, and how can we apply it to the marketing world? While I still believe that data can unearth important insights (no, this article isn’t about bashing the accuracy of data), it has to be tempered with qualitative information and context. That is to say, don’t rely on data or focus groups alone to reach a conclusion about what your customer or audience is thinking. You might be placing an anchor on shaky ground.
Personally, this whole episode has taught me that dealing with data requires a very critical view of the numbers, and a willingness to always question them. Numbers might tell one side of the story, but they may not be a true reflection of the larger truth.