Let's not make this same mistake in future elections.

They got it wrong.

No matter how you feel about the candidates in last month’s presidential election, the real loser was the political polling industry.

Why? Because they went about it all wrong.

 

For the last several months, many of us here at Bovitz regularly tuned into FiveThirtyEight, the brainchild of Nate Silver, who is generally regarded as one of the sharpest political polling analysts in the industry. Silver and his team would conduct extensive analyses of the many polls tracking the presidential race and churn out the latest forecasts and developments on the election.

But they got it wrong this time. Along with pretty much everyone else.

 

And now we—along with pretty much everyone else—are trying to figure out why. So first, we took a few steps back and started with what the polls were actually telling us from week to week: preference. Just a simple indication of preference: “If the election were held today, which candidate would you vote for?”

When the numbers would change, we’d speculate that Candidate X was getting the traditional post-convention bump, or that Candidate Y’s debate performance was now reflected in the new numbers. But even through all of this, we still only had one metric to indicate how the election might turn out: preference.

We relied on that metric for months to tell us who would win the election, while assuming we understood the reasons this measure was constantly changing. But nobody actually investigated the reasons behind the changes—or even the outcome prediction itself. And yet, we know that each major campaign regularly conducted qualitative research to develop and refine their candidate’s message on each issue in the election. So that raises the question:

Why wasn’t more comprehensive research done all these months?

 

Would we have been as surprised as we were—as wrong as we were—if we truly understood the feelings and attitudes of the various population subgroups that impacted the election?

 

Isn’t “voter turnout” just sweeping the true issues under the rug?

This is essentially the same question we ask ourselves when conducting research among consumers:

Why are people behaving this way?

Because if there’s one thing we’ve learned from our many years of studying people, it’s that understanding the why behind something is far more predictive than measuring the what.

 

Numbers on a page are just that: numbers. They may give you some short-term indication of how people will behave, but until you decide to question and understand why the numbers are the way they are, you really haven’t learned anything. Too often, companies get wrapped up in thinking that what matters is how people feel about them—their brand, their product, their reputation—as captured in simple measures of preference and ratings. But what they’re forgetting is that people care far more about themselves than they do about anything else. So, you have to care about them, too.

Similarly, as much as the news media and the pundits try to suggest otherwise, the election results had very little to do with the candidates. And they had everything to do with the people voting for them. People voted based on how each of the candidates aligned with their own deeply held values and beliefs. Just look at how the election results have unleashed these very emotions in people like never before; the joy, anger, fear, despair, and hope that drove people to cast their ballots have now, finally, come out into the open.

 

But these emotions, these reasons for voting—they were really there all along. We just weren’t looking for them.

Acting on emotions, values, and beliefs is how we make all decisions in our lives, whether it’s who to vote for in an election, where to live, or which brand of formula to buy for our child. And again, those decisions have very little to do with you, and everything to do with people.

So, at Bovitz, we choose to focus on why; we choose to focus on people. Our job as researchers shouldn’t be to seek validation for the products and services our clients have created. Rather, it should be to understand the situations and considerations that shape how people make decisions about our clients’ industries.

And if we do our job right, we shouldn’t have to talk to people about you, because we already know enough about them.

 

And that’s what the political polling industry is missing: an understanding of people, not just preference. So, what does all this mean for predictors of future elections? It means they too need to start putting people first.