Andrew Smith lays out some key reasons to be skeptical of the data

Monday, October 28, 2024

The data is seemingly everywhere. On the front page of newspapers around the country. Slowly crawling along the ticker of cable news channels. Blasted out on social media at an increasingly rapid rate. As the 2024 presidential election draws closer, political poll information has become truly ubiquitous.

But prevalence and accuracy are two entirely different things, says UNH expert Andrew Smith. And he recommends employing a healthy dose of skepticism when viewing poll results ahead of the big day.

We all “want to know what the score is,” Smith admits, and polls are an easily digestible and readily available medium. But while they continue to spark heated debate among commentators and community members alike, they aren’t necessarily painting a particularly accurate picture of the electorate.

“You have to be very cautious about the interpretation of polls,” says Smith, professor of practice in political science, director of the UNH Survey Center and a nationally known expert on presidential elections and political polls. “For a host of reasons, both technological and methodological, there’s an overabundance of surveys, and they’re essentially just clickbait for news websites – they’re cheap to run, there’s going to be a new one every day and nobody is going to hold you responsible for publishing them because there’s always another one coming out.”

The combination of those technological and methodological changes is the primary cause of that lack of reliability. As technology has evolved and the preferred survey method has shifted from telephone to web, the development of tools and procedures needed to ensure accuracy has thus far been unable to keep up.

A similar shift happened in the 1960s and 70s, when the transition from in-person surveys to telephone surveys took hold. Telephone surveys were significantly less expensive and took much less time, but it took about 10 years for survey researchers to develop the methodological tools to conduct phone surveys in a sound and accurate manner.

The recent shift to web-based polling has created the same challenge. Web-based surveys are significantly cheaper – Smith says a telephone survey in New Hampshire could cost something like $75,000 to $100,000, while a web survey would be in the $5,000 to $10,000 range.

But all of these changes have happened over just the last few election cycles, and the creation of a sound methodological playbook predictably has lagged behind.

“We’re in the midst of a paradigm shift where we’re still trying to figure out best practices,” Smith says. “That’s a big problem with web-based polls – it’s probably going to take another five to 10 years to determine what those best practices are, and by that time we might have a different dynamic with different technologies out there.”

That hasn’t stopped major media outlets throughout the country from relying heavily on polls to influence their coverage. That’s because they know consumers are hungry for information about an incredibly polarizing election, and polls continue to generate the kind of response those outlets are most interested in.

“They know if they put a poll out, it will get a lot of hits,” Smith says. “There is a definite business advantage for newspapers or media outlets to run polls – they’re trying to make money by getting people to click on poll stories.”

Technology and best practices aren’t the only impediments to accuracy. Human psychology also plays a major role in polling challenges, as there is less incentive to answer honestly in the face of potential backlash. Smith cites the example of a teenager describing weekend activities – they might tell their friends they went out and partied, but if their parents ask, they’ll probably say they stayed home and studied.

“People tend to think of survey research as a mathematical exercise, but in reality, it’s psychological,” Smith says. “How do you get people to give you an honest answer when there are so many societal pressures that make us respond less accurately?”

Smith cites the Spiral of Silence theory, which posits that perception of public opinion influences a person’s willingness to share their own thoughts. People are more hesitant to publicly share an opinion if they believe it could lead to being shut out of a preferred social group. That might mean being less likely to wear a pin or put out a lawn sign supporting a particular candidate – or to speak to a pollster about which candidate you are backing.

So, what is a voter to do to stay accurately informed as the election closes in? It’s a true challenge, Smith says, though he encourages everyone to be vigilant in exploring the origins of any poll information they come across.

“As a pollster, I would just be very cautious. I would advise anyone who is really interested in polling to look at the organizations conducting the research,” Smith says, pointing to the American Association for Public Opinion Research’s Transparency Initiative, which is designed to help polling organizations “show their work.”

In the end, though, Smith says it’s generally best to take all polling info with a rather large grain of salt.

“My advice is to not pay that much attention to it,” Smith admits. “The election will be over soon, and you’ll be able to see what happens – polling isn’t going to make a difference in the election.”