Rothenberg: In Judging Surveys, It All Depends on the Meaning of the Word ‘Poll’
In some respects, there was nothing unusual about the Feb. 18 press release that made it to my desk Tuesday. “Warner Moves on, Kilgore Moves In” was the title.
But just as you can’t judge a book by its cover, you can’t judge a poll by its accompanying press release, even if it includes official-looking crosstabs.
The survey, which raised a few eyebrows in Washington, D.C., found term-limited Virginia Gov. Mark Warner (D) leading Sen. George Allen (R) 48 percent to 41 percent in a hypothetical 2006 Senate race.
The release referred to the “exclusive Emerson College poll of likely Virginia voters.” It also said that David Paleologos of DAPA Research “oversaw the implementation of the poll.”
When I first called Emerson College in Massachusetts, two people, one at the Office of Public Affairs and another in the Department of Organizational and Political Communication (which is apparently as close as Emerson gets to a political science department), told me they had never heard of the poll and knew nothing about it, except that one of the names on the release belonged to a student.
It quickly became clear that this poll wasn’t simply another university-affiliated survey, like polls emanating from Marist College, Quinnipiac University, the University of Connecticut or the University of South Alabama.
After some digging, I found that the survey was conducted by students in an Emerson class, OP 303, also known as Survey Research Methods. The students decided where they wanted to poll, constructed the survey document, made the calls and wrote up the press release.
The instructor in the class, the aforementioned Paleologos, told me that he monitored the class project “the entire time.” He has his own polling firm and is also the director of the Suffolk University Political Research Center, so he isn’t completely without credentials.
But is a poll conducted by an undergraduate class, however well-meaning the students and the instructor, on par with surveys conducted by university survey research centers or major media outlets? Obviously not.
And I’m not alone. “I find it completely inappropriate to describe it as an Emerson College poll,” David Rosen, Emerson’s vice president for public affairs, told me in a telephone interview. His tone suggested that he was not at all pleased with the press release in question.
Instead of calling it an “Emerson College poll,” Rosen chose to refer to it as “a class project.”
Of course, some people may give the “class project” considerable credence nonetheless, because it was overseen by a professional pollster. Again, I’m not sure that’s wise without gathering further information.
I had never heard of Paleologos before I saw the release, and most of what I’ve learned about him has come from the Web.
He is an adjunct professor at Emerson in addition to his role at Suffolk, and he has logged some TV time on Boston’s WHDH, which has occasionally partnered with Suffolk for polling. When I spoke with him, he certainly sounded like a pollster.
But my Web search of Suffolk poll results didn’t make me entirely comfortable that he could turn a classroom of undergraduates at Emerson into a team of reliable pollsters, though I applaud his efforts to teach students about polling and politics.
On Feb. 2, 2004, the day before the South Carolina Democratic presidential primary, Suffolk University and 7NEWS (WHDH-TV) released a poll that found Massachusetts Sen. John Kerry with a 10-point lead over North Carolina Sen. John Edwards in the Palmetto State, 25 percent to 15 percent. The rest of the field was in the low single digits. The Rev. Al Sharpton was sixth at 2 percent.
The actual South Carolina results showed Edwards first with 45 percent, followed by Kerry at 30 percent and Sharpton third at 10 percent.
Readers of the Suffolk survey must have been stunned. But the results weren’t a surprise to most other pollsters. The American Research Group, CBS News, CNN/Los Angeles Times, Insider Advantage and Zogby all had Edwards ahead in their pre-primary polls.
“Kerry is poised to win big in the coastal counties of Charleston and Horry,” proclaimed Paleologos in the press release the day before the primary. In fact, Edwards won Charleston County by 10 points (41 percent to 31 percent) and Horry County by 23 points (52 percent to 29 percent).
Many of Suffolk’s other polls may have been on the mark, and I’ve made my share of inept predictions over the years, including my “80 percent chance” comment that former Vermont Gov. Howard Dean would be the 2004 Democratic presidential nominee. So I’d never say that one bad poll result discredits a pollster. But since Paleologos isn’t widely known, reporters ought to be cautious about accepting his numbers until they know more about him and his record.
“A good pollster can poll anywhere,” a friend told me recently. But that doesn’t keep me from wondering about the reliability of a Virginia “poll” conducted by students in Massachusetts. Personally, I’d prefer my pollster to have some experience in the state where he is polling. (Paleologos told me that he had never polled in Virginia before, though previous classes had polled in North Carolina, New Hampshire and Los Angeles.)
At this point, I’m not even taking issue with the results of the “Emerson College poll” in Virginia. Many of the survey’s numbers seem reasonable, though Warner’s 39-point lead over Allen among independent and undecided voters was described by one Democratic consultant as “very, very fishy.”
It’s hard to know which polls and pollsters to trust, and even good pollsters occasionally produce a result that falls outside the margin of error. That’s the nature of polling.
But I’m afraid the political and journalistic communities are too willing (even eager) to accept poll numbers, particularly when they are allegedly attached to an institution of higher education, even without information about the pollster and the methodology of the survey.
Not all polls are equal, and not all surveys deserve attention from the media or political junkies. And I cite the OP 303 poll as Exhibit A.
Stuart Rothenberg is editor of the Rothenberg Political Report.