Q&A on Polls

The Associated Press, which yesterday publicized a poll showing Obama leading McCain by a single point, today published a nice Q&A about polling practices and why polls can vary.

To me, what jumps out is the reminder that while many pollsters do adjust their results to balance demographic factors like gender and geography, many do not adjust for other factors such as changes in party identification. In the AP poll, for example, 44 percent of the respondents apparently identified themselves as “evangelical” Christians, far above the percentage among the general population or among those who voted in 2004. These factors surely affect poll results, even if not everyone agrees how.

Q: Don’t pollsters simply ask questions, tally the answers and report them?

A: No. After finishing their interviews — usually with about 1,000 people, sometimes more — they adjust the answers to make sure they reflect Census Bureau data on the population like gender, age, education and race. For example, if the proportion of women interviewed is smaller than their actual share of the country’s population, their answers are given more “weight” to balance that out. But some pollsters make these adjustments differently than others. And while most polling organizations including the AP do not modify the responses to reflect some recent tally of how many Democrats, Republicans and independents there are, some do.
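To make the weighting idea concrete, here is a toy sketch in Python of the kind of demographic adjustment described above. The population shares, sample shares, and support numbers are all made up for illustration; real pollsters weight on several variables at once.

```python
# Toy post-stratification weighting by gender (illustrative numbers only).
population = {"women": 0.52, "men": 0.48}   # assumed census shares
sample     = {"women": 0.45, "men": 0.55}   # shares actually interviewed

# Each group's weight is its population share divided by its sample share,
# so an under-represented group's answers count for more.
weights = {g: population[g] / sample[g] for g in population}

# Hypothetical unweighted candidate support within each group.
support = {"women": 0.50, "men": 0.40}

# Weighted topline: sum over groups of (sample share * weight * support),
# which equals the population-share-weighted average of group support.
weighted = sum(sample[g] * weights[g] * support[g] for g in population)
```

Because women are under-sampled here, their weight is above 1 and the weighted topline (45.2 percent) comes out slightly higher than the raw average would.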

Q: Are those the only changes made?

A: No. As Election Day nears, polling organizations like to narrow their samples to people who say they are registered voters. They often narrow them further to those they consider likely voters. That’s because in a country where barely more than half of eligible voters usually show up for presidential elections, pollsters want their polls to reflect the views of those likeliest to vote.

Q: Is that hard to do?

A: Quite hard, since no one will truly know who will vote on Election Day until that day is over. In fact, virtually every polling organization has its own way of determining who likely voters are.

Like many polling organizations, the AP asks several questions about how often people have voted in the past and how likely they are to vote this year, and those who score highest are considered likely voters.
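The scoring approach can be sketched like this. The questions, point values, and cutoff below are hypothetical, not the AP's actual battery; the point is only that past behavior and stated intention combine into a score, and the top scorers are kept.

```python
# Toy likely-voter screen: score a few screening questions, keep top scorers.
respondents = [
    {"name": "A", "voted_2004": True,  "voted_2006": True,  "intent": 10},
    {"name": "B", "voted_2004": False, "voted_2006": False, "intent": 4},
    {"name": "C", "voted_2004": True,  "voted_2006": False, "intent": 9},
]

def turnout_score(r):
    # One point per past vote, plus stated intention on a 0-10 scale.
    return int(r["voted_2004"]) + int(r["voted_2006"]) + r["intent"]

# Keep respondents scoring at least 10 as "likely voters" (arbitrary cutoff).
likely = [r["name"] for r in respondents if turnout_score(r) >= 10]
```

Note how a screen built on past voting would exclude respondent B even with moderate stated intent, which is exactly why turnout surges among new voters are hard for these models to capture.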

Q: Why is this such a problem?

A: Because nobody is 100 percent sure how to do this properly. And the challenge is being compounded this year because many think Obama’s candidacy could spark higher turnout than usual from certain voters, including young voters and minorities. The question pollsters face is whether, and how, to adjust their tests for likely voters to reflect this.

In identifying likely voters, the AP does not build in an assumption of higher turnout by blacks or young voters. Pew Director Andrew Kohut says that reflecting exceptionally heavy African-American turnout in the Democratic primaries, Pew’s model of likely voters now shows blacks as 12 percent of voters, compared to 9 percent in 2004.

Underscoring the uncertainty, the Gallup Poll is using two versions of likely voters this year — a traditional one that asks about people’s past voting behavior and their current voting intentions, and an expanded one that only looks at how intent they are on voting this year, which would tend to include more new voters.

Q: What else might cause differences?

A: The groups pollsters randomly choose to interview are bound to differ from one another, sometimes significantly.

Every poll has a margin of sampling error, usually around 3 percentage points for 1,000 people. That means the results of a poll of 1,000 people should fall within 3 points of the results you would expect had the pollster instead interviewed the entire population of the U.S. But — and this is important — the results are expected to be that accurate only 95 percent of the time. That means that one time in 20, pollsters expect to interview a group whose views are not that close to the overall population’s views.

Q: Are the differences among polls this year that unusual?

A: Not wildly, but that doesn’t make them less noticeable. There’s a big difference between a race that’s tied in the AP poll and Pew’s 14-point Obama lead. But because of each poll’s margin of error, those differences may be smaller, or larger, than they appear.

That’s because each poll’s margin of sampling error should really be applied to the support for each candidate, not the gap between them.

Take the AP poll, which has a margin of error of plus or minus 3.5 percentage points. Obama’s 44 percent support is likely between 40.5 percent and 47.5 percent. McCain’s 43 percent is probably between 39.5 percent and 46.5 percent.

When support for candidates is measured in ranges like that, polls that appear to disagree may actually overlap, and apparent gaps may shrink or widen.
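The overlap check is simple arithmetic. A sketch using the AP figures above (44 percent, 43 percent, plus or minus 3.5 points):

```python
# Apply a poll's margin of error to each candidate separately and
# check whether their plausible ranges overlap.
moe = 3.5
obama, mccain = 44.0, 43.0

obama_range  = (obama - moe, obama + moe)    # (40.5, 47.5)
mccain_range = (mccain - moe, mccain + moe)  # (39.5, 46.5)

# Two ranges overlap when neither lies entirely above the other;
# here they do, so a 1-point "lead" is well within sampling noise.
overlap = obama_range[0] <= mccain_range[1] and mccain_range[0] <= obama_range[1]
```

By this standard even Pew's 14-point lead is the interesting number, not the AP's 1-point edge: the former survives the margin of error while the latter does not.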

Q: Are people always willing to tell pollsters who they’re supporting for president?

A: No, and that’s another possible source of discrepancies. Some polling organizations gently prod people who initially say they’re undecided for a presidential preference; others press harder. The AP’s poll, for example, found 9 percent of likely voters were undecided, while the ABC-Post survey had 2 percent.

– Austin

12 thoughts on “Q&A on Polls”

  1. Jon Austin says:

    Still, it is fun to watch. And, this year, we’ve already got some numbers – exit surveys from early voting are showing 2-1 Democrats versus Republicans among those in line.

    – Austin

  2. EMM says:

    My gut instinct tells me it’s going to be a landslide for Obama. The tipping point is going to come from all of the new, young voters who have cell phones and aren’t necessarily part of random phone calling.

  3. Oh, I agree that it’s fun to watch. We’re all junkies!

    EMM: Are you talking about those same people that either A) don’t bother to actually go vote or B) live on college campuses away from home so often face technical difficulties in actually voting? 🙂

  4. EMM says:

    Mike: I guess I’m talking about the same kids who usually don’t vote, you’re right about that, but who seem very motivated to do so this time out. My daughter says college kids are really hard core about this election; I sense the same thing in my classes.

    But, we will see on Nov. 4.

  5. GB says:

    According to Gallup, “at this point, there is little significant difference in the propensity to vote early between the Obama supporters and the McCain supporters interviewed in the aggregated sample of all interviews conducted from last Friday through Wednesday (among whom an average of 9% said they had already voted).” If I read the article correctly, though, this means in raw numbers there are more Obama early voters than McCain early voters; it’s the percentages that are nearly equal.

    So another set of poll data to consider. Incidentally, Gallup claims to include cell-only users in its sample.


  6. K says:

    Polling…it doesn’t matter half a shit, to quote an earlier poster? What are the confidence intervals on that? Is it somewhere between .33 of a shit and .65 of a shit, at the 95th percentile?

  7. K, I’m certain that 95 percent of the time the actual election results will be more interesting than a thousand pre-election polls, give or take 30 polls. 🙂

  8. Joe Loveland says:

    People love to hate polls. But if polls were banned, I bet Americans wouldn’t be nearly as engaged in campaigns. I bet voter turnout would drop, and voters would be a little less informed about positions. Polls give us a way to know how our neighbors are digesting the debate, and that helps hold our interest and engage us. There are downsides to polls, but our democracy would be poorer without them.

Comments are closed.