
Polling Pains: Polling Blunders Happening Nationwide

Pollsters this spring gave Bruce Rauner up to a 17-point lead over then-state Sen. Kirk Dillard, but Rauner’s margin of victory was less than three percentage points.
WUIS/Illinois Issues

In the final month of the 2010 campaign, the political career of Gov. Pat Quinn appeared to be nearing its end. Poll after poll showed him down by four, six or even eight percentage points. The mathematical models of stat-geek superstar Nate Silver — fresh off correctly predicting the outcome in 49 of 50 states in the 2008 presidential election — gave Quinn just an 18 percent chance of winning. You know how that turned out.

Republican primary polls hadn’t been very predictive that year either, suggesting state Sen. Bill Brady would finish no better than third when in fact he won the nomination. Primary polling was also off this spring, giving investor Bruce Rauner up to a 17-point lead over then-state Sen. Kirk Dillard. Even the Quinn campaign accepted the inevitability of Rauner’s victory, running an ad against him on election night, long before the outcome was certain. Rauner’s actual margin of victory was less than three percentage points.

Three of the most important recent races in Illinois politics have confounded pollsters. Is there something special about this state that makes it particularly hard to poll?

Not really. Illinois has had several unexpected election outcomes in a row, but it’s certainly not alone. Mitt Romney went into presidential election night 2012 with the confidence of a man whose internal polls showed he ought to win. And just this year, U.S. House Majority Leader Eric Cantor was upset in a race in which polling had given him a 34-point lead.

“Illinois isn’t special in terms of being able to accurately predict election results,” says Ashley Kirzinger, director of the Survey Research Office at the University of Illinois Springfield. “I think that the pollsters and the surveyors (in Illinois) are having the same problems that other pollsters are having nationally.”

Some of those problems are specific to each election, like determining whether members of this or that party or demographic will actually show up on election day and weighting polls accordingly. That’s made more difficult by another set of problems that are more systemic, challenging the way polling has been done for decades. They include the move away from landline telephones, the prevalence of caller ID and the drive to keep costs down. Pollsters agree they need to change, but just how has become a source of friction in the industry.

As recently as 15 years ago, getting a decent sample of voting-age Illinoisans was relatively straightforward. Most of the people one would want to reach had a landline telephone. Whether man or woman, young or old, black or white — “Just call, we’re all connected,” as the old Illinois Bell commercial had it. Since then, the world has gone wireless, and polling has become more complicated and expensive. “To actually get a telephone sample of Illinois likely voters means purchasing landline and cell phone records, calling people almost a dozen times to get them to answer the phone — and then to actually get them to stay on the phone and take the survey,” Kirzinger says.

Young people, by and large, do not have landline telephones. And even when they do, perhaps as part of a package deal that includes Internet access and cable TV, they might not have that traditional phone plugged in. “That population, about 10 years ago, they were just starting to become part of the Illinois voting population,” Kirzinger says. “Now they’re in their late 20s, early 30s, (and) they still don’t have landlines, so you still can’t reach them that way.”

The only way one can reach them is on their cell phones. But even purchasing cell phone lists is not a sure-fire way to get at those voters. One of the biggest obstacles: caller ID. Kirzinger says when people see a number they don’t recognize, they often don’t pick up the phone. “I know that I’m guilty of that,” she says. Youthful mobility also plays a role. Kirzinger says her mobile phone has an area code from Kentucky. “I’m not voting in Kentucky; I’m voting in Illinois. But it’s impossible to reach me on a cell phone because there’s no way to know that I’m no longer in Kentucky and I’m actually an Illinois resident,” she says.

For a poll seeking the opinions of a thousand respondents, Kirzinger says, you have to call between 10,000 and 20,000 people. That costs about $30 per respondent via landline and $45 per respondent via cell phone. The poll also might be in the field for seven days, since she says each number will be tried at least four times to control for something called “non-response bias.” Kirzinger says the person who responds to a survey on the first call is inherently different from someone who finally answers on the fifth call. “If you only rely on the people that answer the phone the first time, those are probably the people that are the party faithful,” she says. “They’re the ones who are always going to vote. So you’re not going to get the more swing voters or the ones that only lean a certain way, but aren’t strong Democrats or strong Republicans.”
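
The arithmetic behind those figures, as a rough worked example (the all-landline and all-cell totals are illustrative endpoints, not figures from any actual poll):

\[
\frac{1{,}000}{20{,}000} = 5\% \quad\text{to}\quad \frac{1{,}000}{10{,}000} = 10\% \quad \text{(completion rate)}
\]
\[
1{,}000 \times \$30 = \$30{,}000 \;\text{(all landline)} \qquad 1{,}000 \times \$45 = \$45{,}000 \;\text{(all cell)}
\]

A real sample mixing landline and cell numbers would cost somewhere between those two totals.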

Working in academia, Kirzinger can afford to spend that kind of time on a poll. In the world of media and professional politics, there’s greater pressure to produce results quickly, and for less money, so expecting to spend a week on a survey can be difficult. “It’s unrealistic only from a customer standpoint,” says pollster Gregg Durham. “Customers usually — in this generation of pollsters — they want instant results.” Durham runs the We Ask America polling outfit, part of Xpress Professional Services, which he describes as an independent, for-profit subsidiary of the Illinois Manufacturers’ Association.

While old-school pollsters tend to use live interviews, others mix in automated calls and email polls. Durham says We Ask America uses all three techniques, though most that appear in the press are automated: “My view of it is: It doesn’t matter how you talk to somebody, it matters who you talk to.” He says an advantage of automated polls is that respondents get asked the question in exactly the same way — the names are pronounced correctly, there’s no variation in inflection of the words. He says for head-to-head questions — such as will you be voting for A or B in the race for governor — automated polls are best.

Those sorts of polls are often conducted quickly. Rather than the seven days Kirzinger prefers, a We Ask America poll before this year’s Republican primary was in the field for just one day. That’s the poll that had Rauner winning more than 44 percent of the vote to Dillard’s 27 percent. It was way off the mark, and gets back to the difficulty pollsters can have in predicting which voters are likely to show up on election day.

“The eleven secret herbs and spices of every pollster is the weighting formula,” Durham says. That means if one party or demographic is over-represented among people who responded to the poll, a pollster might expand or reduce that group’s share of the total responses. “One of the dangers, and this happened last time around with a lot of pollsters — including, publicly, Mitt Romney’s pollster … is that people assume certain things.” In the case of Romney, that meant assuming President Obama would not inspire as many Democrats to vote, particularly African Americans, as he did in 2008. Likewise in this year’s Republican primary for Illinois governor, pollsters did not adequately take into account how many Democrats — particularly union members — would cast Republican ballots to vote for Dillard and against Rauner, who had spent much of the campaign bashing “government union bosses.”

It’s also worth noting that We Ask America’s pre-primary poll was automated. “Automated polls can skew a certain way based on the type of individual that’s actually willing to sit there on the phone and press numbers based on what a computer is telling them,” Kirzinger says. “That population tends to be older, they tend to be not minority populations and they tend to lean a little bit more Republican.”
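
To make the weighting idea concrete, here is a minimal sketch of post-stratification on a single demographic. The age groups and shares are hypothetical, invented purely for illustration; they are not drawn from any poll discussed here.

```python
# Post-stratification weighting: scale each group's responses so its share
# of the weighted sample matches its known share of the population.

# Known population shares (e.g., from census or voter-file data); hypothetical.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

# Shares observed among poll respondents (older voters over-represented); hypothetical.
sample_share = {"18-34": 0.15, "35-64": 0.50, "65+": 0.35}

# Each respondent in a group gets the same weight: population share / sample share.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

print(weights)  # approximately {'18-34': 2.0, '35-64': 1.0, '65+': 0.571}
```

The danger Durham describes lives in the first dictionary: the “known” population shares for an election are really a guess about who will turn out, and a bad guess propagates into every weighted result.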

Charles Leonard, who directs polls for the Paul Simon Public Policy Institute at Southern Illinois University Carbondale, offered an additional explanation: “Primaries are goofy.” Get a few hundred people fired up in a low-turnout election, and one’s poll results go out the window. “Turnout is a funny thing. And one of our problems is we try to do likely voter and turnout models, but people lie,” Leonard says. “We get 70 percent of our sample swearing to God they’re going to vote in a Congressional-year election.” Of course nowhere near that number will actually show up at the polls. “Trying to guess which ones will and which ones won’t is something we haven’t figured out how to do yet,” he says.

As response rates decline for telephone polls, the industry is searching for a new way of doing business. The New York Times and CBS News made waves this summer when they announced they would begin using online panels as part of their polling mix. “As the young voters who are less likely to respond to telephone surveys become an ever-greater share of the population over time, it is probably more important for analysts to have an ensemble of surveys using diverse sampling and weighting practices,” writes Nate Cohn, a reporter, on the paper’s website. As part of that, the Times will be working with YouGov, which Cohn writes has “tracked many of its respondents over months, if not years, which gives it additional variables, such as a panelist’s voting history, to try to correct for non-response.”

Leonard says many in his field find the move “kind of perplexing, and we don’t all quite approve.” That’s because online polls do not have random-digit dialing, which the Times acknowledges has been the “gold standard for public polling.” Leonard says the key to conducting a poll that can be generalized from a small sample to the population at large is statistical randomness. “Ideally that means every member of the population should have an equal chance of being chosen,” he says. “And virtually everybody has a phone. Not virtually everybody can join the Internet panel.”

Cohn explains how the Times is addressing this concern: “YouGov attempts to build a large, diverse panel and then match its panelists to demographically similar respondents from the American Community Survey, an extremely rigorous probability survey conducted by the Census Bureau. This step is intended to mimic probability sampling. But it can require significant assumptions about the composition of the electorate, including partisanship.”
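
A minimal sketch of what such matching might look like, assuming panelists and probability-sample respondents are coded on the same numeric traits (the records, field names and distance measure here are hypothetical simplifications, not YouGov’s actual method):

```python
# Match an online panelist to the most demographically similar respondent
# from a probability sample, as a stand-in for having sampled them randomly.

def nearest_match(panelist, probability_sample, traits=("age", "education", "income")):
    """Return the probability-sample record closest to the panelist."""
    def distance(record):
        # Simple squared-difference distance across the chosen traits.
        return sum((panelist[t] - record[t]) ** 2 for t in traits)
    return min(probability_sample, key=distance)

panelist = {"age": 29, "education": 16, "income": 42}
acs_like_sample = [
    {"age": 31, "education": 16, "income": 45},
    {"age": 62, "education": 12, "income": 38},
]
print(nearest_match(panelist, acs_like_sample))  # picks the 31-year-old record
```

The assumptions Cohn flags enter at exactly this point: matching only mimics probability sampling if the traits used for matching capture everything that matters about who answers, including partisanship and likelihood of voting.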

Indeed, many questions about the validity of Internet polling remain: Are people self-selecting into Internet surveys? Does the company go to the time and expense to make sure respondents can only weigh in once, so they can’t, as Leonard put it, “log on and go ‘Quinn, Quinn, Quinn, Quinn’ a bunch of times”? He says he expects The New York Times will spend sufficient money to conduct its online panels properly, “but many of us are nervous about generalizing from an Internet survey to the entire population.”

Scott Keeter, director of survey research at the Pew Research Center, echoed those concerns in a Q&A posted on Pew’s website: “It’s worth remembering that not everyone is online: According to our most recent estimates, 89 percent of U.S. adults use the Internet. The good news from a polling perspective is that figure is steadily increasing — it was 79 percent just five years ago — so the potential bias from excluding non-Internet users is getting smaller and smaller.”

But both Keeter and Leonard say the pollsters know a lot more about the problems with random-digit dialing than with online panels — mainly non-response because of the spread of cell phones. Keeter says the problem with phones cannot, on its own, justify switching to another survey method. “The alternative has to prove itself to be accurate enough and precise enough for the purposes to which we currently apply [random-digit dialing] telephone surveys,” he says. Leonard says he expects all polling will someday be conducted on the Internet but adds that few in his cohort are ready to make that leap.

For news consumers, it can be hard to know which polls to trust. Experts say the best practice is not to limit oneself to just one poll or pollster. By design, about one out of every 20 polls should be off the mark — that’s what the statistical standard of a “95 percent confidence” level means. “One poll can be an outlier. It’s designed to be an outlier,” Durham says. “The whole process is designed to, on occasion, throw a bad poll in there.” Or, if pollsters make even a handful of bad assumptions about who the likely voters are for a given election, a series of bad polls. Ultimately, there’s only one poll that counts for keeps, and that’s when voters cast their ballots on Election Day.
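
To see where the one-in-20 figure comes from, consider the textbook margin-of-error calculation for a simple random sample of n = 1,000 on an evenly split question (p = 0.5); the numbers are a standard illustration, not from any poll in this story:

\[
\text{MOE}_{95\%} = 1.96\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{1{,}000}} \approx 3.1\%
\]

In roughly 19 of every 20 such polls, the true figure falls within 3.1 points of the reported one; in about one of 20, by construction, it does not.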

Illinois Issues, October 2014

Brian Mackey formerly reported on state government and politics for NPR Illinois and a dozen other public radio stations across the state. Before that, he was A&E editor at The State Journal-Register and Statehouse bureau chief for the Chicago Daily Law Bulletin.