Election polls are everywhere. In news articles, on social media and maybe even in your group chat, you’re probably seeing loads of numbers, some of which seem to contradict one another.
One poll says Vice President Kamala Harris is up, another shows former President Donald J. Trump in the lead, and another says 6 percent of Americans think it’s acceptable to put up Halloween decorations before Labor Day.
So let’s talk about how to read polls like a pro. Here, we’ll go over the basics of what to consider. In future installments, we’ll dive into the places polls can go wrong, and what it is that pre-election polls can really tell you about the race.
The golden rule: Never put too much stock in a single poll. Remember that polls are a snapshot of how people were feeling at a particular moment in time; they are subject to error, and are best understood in aggregate, such as through New York Times polling averages.
But when you do want to understand more about a single poll, consider the so-called P.S.T. No, not Pacific Standard Time — take a look at the pollster that conducted the poll, the sample population surveyed, and the time frame in which it was conducted.
The pollsters: Not all are created equal
Some polling firms have better track records than others, and are more transparent about their methods. Others are new, or politically partisan. Knowing the difference can help you understand the poll in the right context.
“The first thing I think about is: Have I heard about this pollster before, and if not, can I figure out something about where they’re coming from, what their perspective is?” said Joshua Clinton, a political science professor at Vanderbilt and the co-director of the Vanderbilt Poll. “You hope that they’re not just like two kids in high school in Pennsylvania.”
He wasn’t just making that up: Patriot Polling is run by two college students, who started the firm when they were in high school. Which is not to say you should disregard polls run by young or new pollsters; it’s just helpful to recognize that they don’t have a long track record.
If you’re unfamiliar with a pollster, you can see how they’ve been rated by third-party groups such as the polling website FiveThirtyEight, which has a list of pollster ratings based on their transparency and accuracy. The election forecaster Nate Silver also maintains a list, and The Times provides the latest polls from select pollsters that have met certain criteria for reliability.
You can also look at whether the pollster has published a detailed methodology. Even if you don’t digest it all (methodologies can be a bit technical), the fact that it’s publicly available is a good sign; many less reliable pollsters simply don’t reveal much about their methods.
Take any polls released by political campaigns with a hefty grain of salt. They can be accurate, and you don’t need to disregard them entirely, but campaigns will make public only the polls they want you to see.
The sample: Who was polled is as important as what they said
Once you’ve determined who conducted the poll, consider who was surveyed. Start by taking a look at the number of people surveyed; you can sometimes find this information in a news article about the poll, but if not, look for links in the article to a press release from the pollster.
A general rule of thumb is that a national poll with 800 to 1,000 respondents provides a decent level of confidence that the sample is representative, though state-level polls can have smaller sample sizes and still be statistically sound.
“If you see a survey of just 100 or 200 people, you’re talking about a double-digit margin of error,” said Ashley Koning, director of the Eagleton Center for Public Interest Polling at Rutgers. “You’re talking about much lower statistical confidence.”
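The link between sample size and margin of error comes from basic survey math. As a rough illustration (assuming a simple random sample and the worst-case 50/50 split; real polls complicate this with weighting and other adjustments), the 95 percent margin of error shrinks with the square root of the number of respondents:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample.

    n: number of respondents
    p: assumed proportion (0.5 is the worst case)
    z: z-score for the confidence level (1.96 for 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 500, 1000):
    print(f"{n:>5} respondents: +/- {margin_of_error(n):.1%}")
```

A 100-person poll comes out to a margin of error near 10 points, consistent with the "double-digit" warning above, while a 1,000-person sample brings it down to about 3 points.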
You’ll also see the kinds of people included in the survey: American adults, registered voters, or, more often at this point in the process, likely voters. The latter can signal that the pollster is trying to focus on respondents who will actually vote (though how best to do this is a matter of debate).
If you read the methodology (often included at the bottom of news articles, or linked to in the press release), you can learn about how the poll respondents were reached. Some pollsters use live interviewers over cellphones and landlines, others use automated phone surveys (sometimes called robopolls), and some recruit respondents to take an online poll via text message.
All of these can be reasonable strategies, but you might be more wary of polls that rely entirely on online opt-in panels, in which anyone can click on a link and contribute responses; this means that the poll isn’t a random sample of the population. Such polls can be hard to spot based on methodology statements, because pollsters are often vague about the composition of their “panels,” but they have a below-average track record for accuracy in recent years.
Also: Consider what the pollster actually asked of the survey sample. You can usually find the exact wording of the questions in a poll’s so-called topline data, often linked to in the press release.
Different ways of wording a question can lead to different responses, and can sometimes prompt respondents to think about an issue differently. Imagine, for example, how asking respondents if they are “pro-life” compares with asking if they are “anti-abortion rights.” Consider how you might respond to the question as the pollster phrased it, and keep that in mind when looking at the results.
The time frame: Don’t forget to check the calendar
As we hurtle through the final weeks of the election, new poll results are released daily. But it’s helpful to look at when a poll was actually conducted, not just when it was released.
This can help you put responses in the context of news events as they have unfolded. For example, was the poll fielded before or after the second apparent assassination attempt against Mr. Trump?
Also, look at how many days the pollster spent contacting respondents. Very short turnaround times can mean pollsters had to compromise on whom they were able to reach.
The golden rule: When in doubt, average it out
If this feels like a lot of homework, it is. Digging this deeply into every poll you come across would be prohibitively time-consuming. “The hard part is it’s just finding the time to kind of play that detective role,” Professor Koning said. “And frankly, a lot of that work is kind of being done for people.”
So when in doubt, or short on time, refer back to the golden rule, and consider what the polls are saying in aggregate. You can compare multiple polling aggregators (like The Times, FiveThirtyEight and Silver Bulletin). While taking the time to investigate a single poll can be informative, you’ll get a better understanding of where the race stands from looking at the broader picture.
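Averaging polls is itself simple arithmetic. As a toy sketch (the numbers here are hypothetical, and real aggregators go further, weighting polls by recency, sample size and pollster rating), combining several polls damps the noise in any one of them:

```python
# Hypothetical recent polls: (candidate margin in points, sample size).
# Positive values favor one candidate, negative the other.
polls = [(+2.0, 1000), (-1.0, 600), (+3.0, 800), (0.0, 1200), (+1.0, 400)]

# A naive, unweighted average of the margins.
simple_avg = sum(margin for margin, _ in polls) / len(polls)

# Weighting by sample size gives larger polls more say -- one of
# several adjustments real aggregators make.
weighted_avg = (sum(margin * n for margin, n in polls)
                / sum(n for _, n in polls))

print(f"Simple average:   {simple_avg:+.2f} points")
print(f"Weighted average: {weighted_avg:+.2f} points")
```

Either way, no single outlier poll moves the average much, which is the point of looking at the race in aggregate.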
The post Three Things to Look for in Any Election Poll appeared first on New York Times.