2. How does polling work?
Pew Research Center
A nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world.
The general goal of a poll is to describe or explain something about a population of interest (for example, the general public or registered voters) by asking questions of a small number of people who represent that larger population.
Conducting a poll requires three things: a way of choosing whom to interview (the sample), a way of talking with them (the mode of interview) and a set of questions to ask (the questionnaire).
Once you’ve conducted the interviews, you have to tally up the data and summarize the results in a way that accurately reflects what your respondents told you.
Probably the most common question we get from our readers is “How do you choose who to interview?” We have sharp readers! How a poll samples people for inclusion is the single most important factor determining how accurate the poll will be. We’ll talk about this in detail below.
But it’s important to note that the choice of a sampling method often determines what mode of interview is best. For example, if we sample telephone numbers, the obvious mode choice is a telephone survey. If we sample addresses, we can conduct either a mail survey or an in-person survey.
All of this is a bit of a simplification, of course, but the essence of conducting a good poll is doing the best possible job with these three things: the mode, the sample and the questionnaire.
The mode of interview
Most of the first polls back in the 1930s and ’40s involved interviewers knocking on doors and asking questions of the people who happened to be at home. This kind of personal interviewing was a common method until the 1980s, though it’s very expensive and rarely used today in the U.S.
Paper-and-pencil surveys, often delivered by mail, were popular and remain a staple of the polling world today. Telephone interviewing became widespread in the 1980s and constituted much of the polling done over the following decades.
But the rise of the internet added a new way to interview people. Today, web surveys account for the majority of all polling because they are faster and – because an interviewer is not needed – much less expensive. Typically, people are initially contacted by phone, mail or email and then invited to take a survey on the web.
Each method or mode of interview has strengths and weaknesses. Using a live interviewer can help persuade people to participate and keep the interview focused.
But the presence of an interviewer can also affect how people respond. Self-administered surveys, like those done on paper or online, may yield more honest and accurate answers and allow people to respond whenever it’s convenient for them. They help reduce what’s called social desirability bias, a tendency of people to answer in a way that leaves a favorable impression – for instance, saying they voted even if they didn’t.
Regardless of mode, the goal is always for all respondents to have the same experience and answer freely and thoughtfully. One of the interesting trends in polling today is the rise of multimode surveys – polls that use more than one method of reaching people or that give those sampled the option of responding in more than one way.
The sample
If you field a poll only among your family, friends and co-workers, you may get interesting results, but you wouldn’t be able to generalize the results to people you don’t know. The people in your social, religious and work circles are likely to be similar to each other – and to you – in ways that matter for their opinions and experiences. For that reason, a good poll needs a sample that includes lots of different kinds of people.
The best way to get that diversity is through something called random sampling, a method that gives everybody in your population of interest an equal (or at least a known) chance of being included.
How does that work? Typical random sampling approaches include calling a random selection of telephone numbers (including cellphone numbers) or mailing a survey to a random sample of addresses.
Random phone numbers can be generated from a known set of area codes, exchanges and local groups of numbers that have been assigned to people or households. The U.S. Postal Service, meanwhile, maintains a list of all residential addresses in the U.S. These two approaches give nearly every American a chance to be polled.
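As a toy illustration of the idea – not any particular pollster’s procedure – drawing a simple random sample from a hypothetical frame of addresses can be sketched in a few lines of Python:

```python
import random

random.seed(42)  # for a reproducible illustration

# Hypothetical sampling frame: in practice this would be a list of
# assigned phone-number blocks or a residential address list.
frame = [f"address_{i}" for i in range(10_000)]

# Simple random sampling without replacement: every address has the
# same known chance of selection (here 500 / 10,000 = 5%).
sample = random.sample(frame, k=500)

print(len(sample), len(set(sample)))  # 500 500 -- no duplicates
```

The key property is that selection probabilities are known in advance, which is exactly what opt-in sampling (discussed below) lacks.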
Random sampling isn’t perfect. Not everyone who is sampled for the survey can be contacted. And in a typical poll, most who are contacted don’t agree to be interviewed. There are also some differences between those who are contacted and participate in a poll and those who don’t. But pollsters can correct for this problem using a technique called weighting.
How weighting works
Weighting boosts the voices of people who belong to groups that are less likely to participate in polls and lowers the voices of people from groups that are more likely to take polls.
For example, people who don’t have a college degree are less likely than college graduates to participate in polls, so those who do participate are “weighted up” to match their actual share of the population. On the other hand, people who engage in volunteer activities tend to be more likely to participate in polls, so their responses are “weighted down” to accurately reflect their share of the population in the final result.
Pollsters rely on knowledge about the population from the U.S. census or other sources to guide their decisions on weighting. It’s important to remember that the goal of weighting is to ensure that the voices of different groups are accurately represented in a poll’s results – and that the overall result accurately represents the views of the U.S. population as a whole.
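A minimal sketch of this idea is post-stratification on a single variable. The education shares and answers below are made up for illustration; they are not actual census or Pew figures, and real weighting adjusts on many variables at once:

```python
# Assumed population shares (illustrative, not real census data).
population_share = {"college": 0.38, "no_college": 0.62}

# Toy respondent pool in which college graduates are overrepresented:
# (education, answered "yes" to some question)
respondents = (
    [("college", True)] * 30 + [("college", False)] * 30
    + [("no_college", True)] * 10 + [("no_college", False)] * 30
)

n = len(respondents)
sample_share = {
    g: sum(1 for edu, _ in respondents if edu == g) / n
    for g in population_share
}

# Weight = population share / sample share: underrepresented groups
# are weighted up, overrepresented groups are weighted down.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

weighted_yes = sum(weights[edu] for edu, ans in respondents if ans)
total_weight = sum(weights[edu] for edu, _ in respondents)
print(round(weighted_yes / total_weight, 3))  # 0.345
```

The unweighted “yes” share is 40%, but because the weighted-up non-college group answers “yes” less often, the weighted estimate drops to 34.5% – the weights pull the result toward what the full population would have said.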
If it sounds complicated, it is. It’s one of the more complex things pollsters do.
Opt-in sampling
A lot of the polls you might see these days don’t even use random sampling. Instead, they rely on various techniques to get people online to volunteer (or opt in) to take surveys, often in exchange for small rewards like gift cards. As a group, opt-in surveys are less accurate than those that use random sampling. This is especially true for estimates for young adults, Hispanics and other minority groups.
Opt-in polls make up a big share of the market research world and were used by a sizable share of organizations doing presidential polling in 2020.
How is it possible for these polls to accurately represent the population? One common way is through weighting – the same process used with random samples. But in the case of opt-in surveys, we don’t have the benefit of knowing that all kinds of people and groups in the population had a chance to be included, so there’s a much greater burden on weighting to make the sample match the population. Unfortunately, some of these opt-in surveys use very rudimentary approaches to weighting, which are less effective in making them representative.
A big hazard in opt-in samples: Bogus respondents
Opt-in surveys are also vulnerable to contamination from bogus respondents – people who are ineligible for the survey or who provide insincere responses.
An improbable finding from a December 2023 opt-in survey looked like an example of this. The prominent survey found that 20% of adults under 30 strongly or somewhat agreed that “the Holocaust is a myth,” an implausible result given earlier polling on the subject. Pew Research Center repeated the question on its probability-based panel and found that only 3% of adults under 30 agreed with this inflammatory statement.
Young adults were not the only group that stood out in the opt-in poll: 12% of Hispanics also agreed with the controversial statement about the Holocaust. Our research suggests that some people who engage in insincere responding (such as answering “yes” to every question) are claiming identities that they believe will give them access to more surveys and rewards. That is, they say they are Hispanic or young – but aren’t really.
To help us observe this phenomenon in practice, we asked an opt-in sample to tell us whether they were licensed to drive a nuclear submarine, a qualification held by very few Americans. (While the exact number of submarine operators is unavailable, the entire U.S. Navy constitutes less than 0.2% of the adult population.) Among adults under 30 in this opt-in sample, 12% said yes. Among those claiming to be Hispanic, 24% said yes – versus 2% of non-Hispanics.
Of course, a zany question like this may inspire some silly answers. But we see similarly implausible results in opt-in surveys for more mundane questions like the receipt of certain government benefits, so the problem isn’t confined to topics like nuclear subs.
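A quick simulation shows why even a small share of yes-to-everything responders can distort a subgroup estimate this badly. All the rates below are assumptions chosen for illustration, not measured values:

```python
import random

random.seed(0)

N = 10_000
BOGUS_SHARE = 0.05   # assume 5% of opt-in respondents answer "yes" to everything
TRUE_YES_RATE = 0.0  # essentially no one really holds a submarine license

responses = []
for _ in range(N):
    bogus = random.random() < BOGUS_SHARE
    # Assumption: bogus respondents also claim identities (e.g. Hispanic)
    # they think will unlock more surveys; sincere ones report truthfully.
    claims_hispanic = True if bogus else random.random() < 0.17
    says_yes = True if bogus else random.random() < TRUE_YES_RATE
    responses.append((claims_hispanic, says_yes))

hispanic = [r for r in responses if r[0]]
yes_rate = sum(1 for _, says_yes in hispanic if says_yes) / len(hispanic)
print(f"{yes_rate:.0%}")
```

Even though the true rate is zero, the bogus 5% all land in the claimed-Hispanic subgroup and all say yes, so they make up roughly 0.05 / (0.05 + 0.95 × 0.17) ≈ 24% of it – close to the implausible figure the opt-in survey produced.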
The questionnaire
The heart of a good poll is the set of questions to be asked. Seems simple, right? After all, we ask and answer questions all day long when talking with other people.
But we also know that misunderstandings are common in human conversations. So it’s important that poll questions are relatively simple, clear and understandable by people from a wide variety of backgrounds. (Check out our five-minute Methods 101 video on writing good survey questions.)
It’s also important to ask questions that people can answer. While that may seem obvious, you might be surprised by how many polls ask people about topics they’ve never thought about or know little about. Polls also sometimes ask people to remember details about things they’ve done in the past that hardly anyone could be expected to recall accurately.
Another concern is that polling questions can intentionally or unintentionally lead people to answer in a particular way, just by the way they are posed. For example, asking people whether they agree or disagree with a statement usually gets more agreement than if the statement is paired with an alternative point of view.
A 55% majority in 1994 agreed with the statement “The best way to ensure peace is through military strength.” But posed against an alternative view – “Good diplomacy is the best way to ensure peace” – just 36% in another poll conducted at the same time endorsed “peace through military strength.”
Don’t be too suspicious about this, though. Most public pollsters are trying to get a fair reading of people’s opinions. And as a consumer of polls, you can do a pretty good job of evaluating the fairness of questions by consulting the full text of the question wording (sometimes called a topline). Read the question aloud: If it sounds loaded to you, it probably is. And if the full text of the questionnaire isn’t made available, you should be skeptical about the poll.
At Pew Research Center, we make all our questionnaires publicly available. Here’s an example (PDF).
Throughout the process of conducting a poll, we try to be mindful that our respondents, collectively, give us hundreds of hours of their time when they read our survey questions and provide their answers. We are grateful and honored that people trust us with their views, so it’s important that we get it right when we tell the world what those views are. That’s what good pollsters do.
Extra credit
Our Methods 101 video series explains the basic methods we use to conduct our survey research. Here’s a playlist of short videos for you to watch at your leisure.