Breaking news: According to an exclusive new telephone poll, just 31 percent of Florida voters believe their congressional representative deserves to be re-elected. More than a third find President Barack Obama’s demeanor “too cool and intellectual.” And slightly more than half would rather have Jon Stewart than Glenn Beck as a dinner guest in their home.
Oh, and 68 percent don’t trust polls.
I can exclusively reveal these results because I commissioned this poll myself.
I wrote the questions, I decided which households to call, and I recorded the computerized prompts, so the 107 Floridians who took the survey heard my recorded voice when they answered the phone. I ordered the poll online, the same way I order pizzas and Netflix, and I paid for it with PayPal. It cost less than $200.
I did this not because I really cared to measure attitudes about the president, members of Congress, or Stewart and Beck. Rather, I was trying to learn more about a fast-growing and controversial trend in public opinion research: do-it-yourself polling.
Websites have begun to spring up that allow users to create and run their own telephone surveys at a tiny fraction of the cost of hiring professional pollsters. The sites are becoming popular with political campaigns, the media and other groups. Precision Polling — the website I used to run my survey — says it makes up to 200,000 polling calls a day.
“It has the potential to be a big thing,” said Mark Blumenthal, the editor of HuffPost Pollster (formerly Pollster.com), which aggregates and analyzes poll data. Earlier this year, Blumenthal called do-it-yourself polling “a harbinger of our polling future.”
Of course, as with any do-it-yourself project, the quality of the finished product depends on the skill of the do-it-yourselfer. And by that standard, the results of my poll could be the social-science equivalent of a bad home perm or my flood-inducing effort to fix the garbage disposal.
With no professional guidance from a pollster, statistician or other expert, I can’t be sure my survey methodology was valid, my poll questions were unbiased or my sample was representative of the population. In short, while my poll was certainly economical, its results may be totally meaningless.
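Even setting aside question wording and sampling, the raw size of a poll limits its precision. A rough sketch of the standard calculation (assuming, generously, a simple random sample — which my poll was not) shows why 107 respondents leaves a wide margin of error:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# For my 107 Floridians, at the worst case p = 0.5:
moe = margin_of_error(107)
print(f"±{moe * 100:.1f} points")  # roughly ±9.5 points
```

In other words, a reported "31 percent" from a sample this size could plausibly be anywhere from the low 20s to about 40 — and that assumes the sample was representative in the first place.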
“I’m worried about people thinking that all you’ve got to do is put down a few hundred dollars and call for an hour and feel like you have a representative, accurate measurement,” Blumenthal told me in a phone interview.
“Just being able to get a list of phone numbers and make a list of calls,” he said, “does not yield a representative sample, much less a poll that anyone would want to put their name on.”
Leaders of the newfangled polling operations concede the potential pitfalls but say the risks of do-it-yourself polls are overstated.
“There’s no question that people could design a bad survey and run it through us,” said Dave Goldberg, the CEO of Precision Polling’s parent company, the online research firm SurveyMonkey. “But I think in general, the benefits outweigh the potential negatives, and most people will get pretty good quality results.”
Opening up the market
In many ways, self-service polling can be seen as part of the natural evolution of the Internet. Just as blogs have allowed anybody to become a pundit and YouTube enables anyone to distribute video, services like Precision Polling make it easy and inexpensive for anybody to become a pollster.
While conventional surveys by full-service research firms can cost $10,000 or more, Precision Polling says its average customer spends about $1,000. The company charges 10 cents per call for an automated poll like the one I conducted.
“It opens up the market for people who couldn’t afford polls to be able to do it,” Goldberg said. The company’s clients include many people who are new to survey research — including bloggers, advocacy groups, and dozens of local political candidates. (Precision Polling also is sometimes used by experienced social scientists, who find the self-service interface saves money on data collection.)
Meanwhile, Rasmussen Reports — a firm that’s already controversial for its prolific use of automated political polls — now allows customers to create their own surveys through the website of a spinoff company called Pulse Opinion Research. Pulse charges as little as $600 for a single-question poll and $1,500 to $2,500 for more detailed ones. Unlike the surveys that carry Rasmussen’s name, Pulse’s questions are not reviewed or approved by the company.
“All that Pulse does is take the questions, turn them around, and give them back to the client,” company president Scott Rasmussen said in a phone interview. “If you went and asked some off-the-wall question, Pulse would not vouch for your interpretation of the data or the reasonableness of the question.”
Rasmussen wouldn’t disclose the names of Pulse’s clients, though Fox News has used Pulse for several recent candidate polls.
Not surprisingly, the growth of do-it-yourself polling concerns traditional social science researchers — many of whom look upon the self-service upstarts roughly the same way a Harvard professor might regard a correspondence course or an estate lawyer might view a will-and-testament kit.
“When people try to do things extremely cheaply and use these quick methods, there’s many, many shortcuts and many, many problems,” said Frank Newport, the editor in chief of Gallup and president of the American Association of Public Opinion Research, the industry’s professional association.
Newport worries that untrained pollsters can unwittingly write incomplete or leading questions, choose a sample that’s unrepresentative of the general population, or make other methodological errors. “Without professionals at all parts of the process, I’d be very sensitive about the validity of the results,” he said.
Promoters of the do-it-yourself model dismiss suggestions that poll design is too complicated for the average Internet user to master.
“A hundred years ago, they said the number of cars that would ever be sold would be limited because you could only train so many chauffeurs,” Rasmussen said.
He notes that his Pulse website offers a pre-designed template for pre-election surveys, where inexperienced polltakers do little more than insert candidates’ names. And Precision Polling and SurveyMonkey provide extensive online tutorials on poll design, with such tips as “be brief,” “avoid loaded questions,” and “make the survey interesting.”
“If you threw a random person in front of this and said, ‘Tell me who’s going to win a particular midterm election,’ they probably couldn’t do that,” said SurveyMonkey methodologist Philip Garland. “But I think there’s a way to get people the proper framework to do something good.”
A challenge for news consumers
Both supporters and critics of self-service polling agree that the practice will accelerate the proliferation of polls, especially the automated variety whose use is already skyrocketing. (It’s not unusual for a single Senate race to inspire three or four public polls each week.) That, in turn, places more burdens on news consumers to interpret and contextualize the flood of polling data.
“Before the Internet exploded, big networks and newspapers were gatekeepers for this kind of polling data,” Blumenthal said. “Now if you have a poll and you can e-mail it to the right people, it will end up on websites and start percolating through the Internet.”
To further complicate the situation, there’s little consensus among researchers about what makes a good poll. (For instance, some experts — and many news organizations — distrust automated telephone polls, while others have concluded that surveys conducted by machine are as accurate as those with live operators.)
In general, though, Blumenthal says consumers can evaluate a poll’s credibility by examining such factors as who paid for it, whether the questions seem loaded or misleading and whether its results are consistent with other polls on the same issue.
As for my own poll, Blumenthal suggested that it fell short in several ways: My polling sample, which I obtained from a Florida voter list, was small and likely skewed too old; people older than 59 made up two-thirds of my sample, while that age group accounts for only about a third of Florida voters. I had Precision Polling place all the calls on one night, but experts suggest spreading calls across multiple days and times.
And because I have no quantitative research knowledge beyond what I remember from my freshman college statistics class, I made no effort to weight my results to compensate for deficiencies in my sample.
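The basic idea behind weighting is simple, even if doing it well is not: respondents from over-represented groups count for less, and those from under-represented groups count for more. A minimal sketch, using the hypothetical age shares described above (60-and-over respondents at two-thirds of my sample versus roughly one-third of Florida voters) — this is an illustration, not how professional pollsters actually build their weights:

```python
# Post-stratification weighting sketch. The shares below are the rough
# figures from the article's age-skew example, not real poll data.
sample_share = {"under_60": 1 / 3, "60_plus": 2 / 3}
population_share = {"under_60": 2 / 3, "60_plus": 1 / 3}

# Each respondent's weight is their group's population share
# divided by its share of the sample.
weights = {g: population_share[g] / sample_share[g] for g in sample_share}
# under_60 respondents count double (2.0); 60_plus count half (0.5)

def weighted_yes_share(responses):
    """responses: list of (age_group, answered_yes) tuples."""
    total = sum(weights[group] for group, _ in responses)
    yes = sum(weights[group] for group, answered_yes in responses if answered_yes)
    return yes / total
```

With a toy sample of four older "yes" voters and two younger "no" voters, the unweighted yes share is two-thirds, but the weighted share drops to one-third — which is exactly the kind of correction I never attempted.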
“What you’re looking at are a hundred unweighted interviews of whoever answered the phone,” Blumenthal politely scolded me. “As you found out, you can do a poll without knowing the first thing about what you should know.”