Expecting More From News-Sponsored Polls

Last week, MinnPost released its inaugural public opinion poll, another step in its maturation as an increasingly central part of the Minnesota news landscape. I maintain polls are an important part of news coverage in a democracy, and MinnPost proved the point when it was the first to tell the story of the public blaming Republicans, by a 2-to-1 margin, for the bitterly debated government shutdown. After months of wonky budget debate coverage, it was interesting to read about the public verdict, as measured by a random sample survey. Our little MinnPost is growing up.

But I have higher aspirations for MinnPost. In the future, I hope MinnPost polls will focus on more than just “approval,” “blame,” and “if the election were held today” questions. Goodness knows, that ground is already covered ad nauseam by the Star Tribune, Pioneer Press, Minnesota Public Radio, the University of Minnesota Humphrey School, St. Cloud State and many others.

I hope MinnPost, or someone else in that pack, also asks questions that probe the values underpinning the opinions. For example, they could ask something like this (a rough sketch of how such a battery could be tallied follows the list):

“I’m going to read to you reasons some people have given for disapproving of the job Governor Mark Dayton has done. For each one, tell me how compelling you find that reason (very compelling, somewhat compelling, not very compelling, not at all compelling):

1) He has been too unwilling to compromise with legislative leaders;
2) He has been too quick to cave in to legislative leaders’ demands;
3) He hasn’t done enough to keep his campaign promise to create jobs;
4) His positions are too liberal;
5) His positions are too conservative;
6) His positions are too “middle-of-the-road”;
7) I don’t really know what he stands for as a leader;
8) He just doesn’t come across as a strong leader.”
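
For the data-minded, here is that rough sketch, in Python, of one way responses to the eight-item battery could be stored and tallied. It is purely illustrative: the shorthand labels and the two sample respondents are made up, and it is not any pollster’s actual instrument or data.

```python
# A rough sketch, not any pollster's actual instrument: one way the responses
# to the eight-item battery above could be stored and tallied. The shorthand
# labels and the two sample respondents are made up for illustration.
from collections import Counter

REASONS = [
    "too unwilling to compromise",
    "too quick to cave in",
    "hasn't kept jobs promise",
    "too liberal",
    "too conservative",
    "too middle-of-the-road",
    "unclear what he stands for",
    "doesn't come across as strong",
]

SCALE = ("very compelling", "somewhat compelling",
         "not very compelling", "not at all compelling")

def tally(responses):
    """Count, for each reason, how many respondents picked each scale point."""
    counts = {reason: Counter() for reason in REASONS}
    for respondent in responses:               # one dict per respondent
        for reason, answer in respondent.items():
            counts[reason][answer] += 1
    return counts

# Two made-up respondents, each answering item by item (never as one big chunk):
sample = [
    {r: "somewhat compelling" for r in REASONS},
    {r: "not at all compelling" for r in REASONS},
]
for reason, dist in tally(sample).items():
    print(reason, [dist[s] for s in SCALE])
```

Each item gets its own answer on the four-point scale, which is what lets the analysis show which explanations for disapproval resonate most.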

Similarly, they could ask a battery of questions probing public approval of “the Legislature as a whole” and “the legislators who represent you personally,” since those mindsets historically have tended to vary significantly. MinnPost can accommodate this deeper level of learning, because its online format allows for longer pieces, and its civically engaged readership will tolerate it.

If you don’t see the value in probing, look no further than surveys about health care reform. Many news media pollsters only ask whether respondents support or oppose the health care reform law. Consumers of those news media polls then usually jump to the conclusion that those who answered “oppose” want no reform, or less government involvement in health care.

But news outlets whose polls probe more deeply learn that many Americans actually oppose the health reform law because they want MORE government intervention – a public option or a single-payer system – not less. For example, a CNN poll a few months back found only 43% supporting the health reform law. BUT, CNN also uncovered that another 13% of respondents opposed it because it’s “not liberal enough.” That means that 56% (i.e., 43% plus 13%) either support the health reform law as is or want it stronger.
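
To make that arithmetic explicit, here is the calculation, taking the CNN figures quoted above at face value:

```python
# The arithmetic in the CNN example above, taken at face value (not re-verified here).
support_as_is = 43              # % of all respondents who support the law
oppose_not_liberal_enough = 13  # % of all respondents who oppose it as "not liberal enough"

print(support_as_is + oppose_not_liberal_enough)  # 56 -> a majority wants the law as is, or stronger
```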

This is hardly the ringing endorsement of Tea Partyism that many news outlets, including CNN, were reporting at the time. It’s a perfect example of why designing polls to learn about the “why” of an issue is so important.

Getting at the “whys” of the horserace poll questions is as important as getting at the “whats.” Is Dayton getting lower approval ratings because he is perceived to be compromising too little or too much? Is he considered too liberal or not liberal enough? Do most of Dayton’s detractors think he didn’t cut spending enough, or do they think he should have spent more to create jobs? Is he losing approval because of his ideology or his leadership style?

This kind of probing delivers a much richer level of understanding than simple “approval/disapproval” style horserace questions. And that kind of news media polling would certainly be in keeping with the MinnPost motto, “a thoughtful approach to news.”

– Loveland

8 thoughts on “Expecting More From News-Sponsored Polls”

  1. Gailkate says:

    This is a great idea, but somehow these probing questions need to be separated into more digestible chunks. You’re asking people to hear and comprehend a complex essay question.

    Without a doubt we need to tease out the distinctions people are not being allowed to voice, because polls don’t just reflect opinions, they push them. I remember being staggered by the polls that showed 85% of Americans supported the Gulf War. In truth, I believe 85% of Americans thought Saddam Hussein was a brutal pig, but they didn’t want a couple of hundred thousand innocents to die in defense of Kuwaiti oil.

    1. Joe Loveland says:

      Re: “Digestible chunks”

      Very much agree. The sample question is effectively 8 questions, with a separate response given after each of the eight propositions. It’s not meant to be presented to the respondent in one chunk.

  2. Ellen Mrja says:

    Or you could ask: “In what ways do you think our governor, Mark Dayton, represented strong leadership to the average Minnesotan in the recent budget negotiations?”

    1. Joe Loveland says:

      You could, but open-ended questions are prohibitively expensive for most sponsors.

      Lots of different ways to wordsmith, and experienced pollsters are better at it than me. But the larger point is: Go deeper.

    2. Ellen Mrja says:

      You miss my point entirely, Joe. Your question was not a strong one because it was based on an assumption: “I’m going to read to you reasons some people have given for disapproving of the job Governor Mark Dayton has done.” People disapproving of the job he’s done. That’s a loaded question.

      So you might as well ask a differently loaded question, including the words “strong leadership,” and you’ll get similarly skewed results.

      1. Joe Loveland says:

        I understand your point, Ellen, and it’s a very fair one. Sorry, I’m a little dense.

        I didn’t mean to imply that the sample question is the only battery of questions asked. Typically in this kind of analytical poll you would want to learn why respondents have BOTH positive and negative views on the subject at hand. But in the interest of space, I was only listing the one example. You could certainly include a battery of questions probing why respondents approve of Dayton, as well as why they disapprove of him.

        If a leader’s approval is headed downward, as Dayton’s is at the moment, the reasons for the decline strike me as most interesting and newsworthy. When a leader is getting more popular (e.g. Bachmann in IA), I’m most curious about what is behind that. When a leader is getting less popular (e.g. Dayton in MN, Obama in US), I’m curious what is behind that. Personally, I’m more curious about the trend than the exception to the trend.

  3. Newt says:

    News organizations should stick to what they do poorly, and the least: Cover the FACTS.

    Opinion polls are voodoo and subject to mischief, misapplication and misinterpretation.

    1. Joe Loveland says:

      So they should pretend to tell the story of a democracy without reporting on what the demos think? I disagree. They have to cover public opinion or they’re ignoring the driving force of democracy.

      How do they cover public opinion? 1) Speculate about it (e.g. “my friends, neighbors and family say…”); 2) report small sample anecdotes (e.g. Joe Man-on-the-Street says…); or 3) do statistically predictable random sample surveys.

      The latter is imperfect, and difficult and expensive to execute, but that’s the option that will get you closest to the truth.
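
      For the skeptics, a minimal sketch of the math behind that claim: the standard 95% margin-of-error formula for a simple random sample. The sample size of 800 below is illustrative, not taken from any particular poll.

      ```python
      # Margin of error (95% confidence) for a simple random sample, assuming a
      # worst-case 50/50 split. Illustrative only; n=800 is not any poll's actual size.
      import math

      def margin_of_error(n, p=0.5, z=1.96):
          """Half-width of the 95% confidence interval for a sampled proportion."""
          return z * math.sqrt(p * (1 - p) / n)

      print(round(margin_of_error(800) * 100, 1))  # ~3.5 percentage points
      ```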
