May 23, 2012

Two recent New York Times articles included significant numerical errors that elicited howls of protest from readers and critics.

In each case, the wrong number was core to the story’s central thesis, leading some to suggest the entire article should have been retracted or completely altered.

Both cases highlight how erroneous numbers, once revealed, can become the story, overshadowing the article itself.

First error: Wall Street psychopaths

On May 12, the Times published an opinion article, “Capitalists and Other Psychopaths,” that stated, “A recent study found that 10 percent of people who work on Wall Street are ‘clinical psychopaths’ and that they exhibit an ‘unparalleled capacity for lying, fabrication, and manipulation.’ ”

Writing in The Daily Beast, Edward Jay Epstein traced the claim to a report in The Week about the work of Canadian forensic psychologist Robert Hare. That piece, in turn, was based on an article in CFA Magazine.

“A game of telephone fools the Times,” read the headline on a Columbia Journalism Review piece about the error. Sounds about right. Ryan Chittum added more detail:

In other words, the Times’s false information was sourced from The Week, which sourced it, via aggregated posts at master aggregators Business Insider and Huffington Post, from CFA Institute magazine which sourced it, erroneously, from “Studies conducted by Canadian forensic psychologist Robert Hare.”

Even worse, he noted, the 10 percent figure had been called out as fake two months prior to the Times piece.

“The problem here is that Hare never conducted a clinical study of the financial-service industry, and never presented evidence that 10 percent of its members were psychopaths,” Epstein wrote.

Yet this phantom number spread like wildfire, ending up in a very prominent place in a Times opinion piece.

The Times responded with a correction:

An opinion essay on May 13 about ethics and capitalism misstated the findings of a 2010 study on psychopathy in corporations. The study found that 4 percent of a sample of 203 corporate professionals met a clinical threshold for being described as psychopaths, not that 10 percent of people who work on Wall Street are clinical psychopaths. In addition, the study, in the journal Behavioral Sciences and the Law, was not based on a representative sample; the authors of the study say that the 4 percent figure cannot be generalized to the larger population of corporate managers and executives.

There’s a lot of detail in that correction, which means some big things were being corrected.

First, and most obviously, 10 percent shrinks to 4 percent. That’s a notable drop.

But the new number comes with a major hedge. The correction explains that the 4 percent figure isn’t even specific to Wall Street workers; it describes a non-representative sample of “corporate professionals.” The Wall Street angle is no longer accurate.

In his piece about the error, Epstein quotes Ryan Holiday, author of “Trust Me, I’m Lying: Confessions of a Media Manipulator,” as saying, “Headline-grabbing trend manufacturing such as this now dominates the pseudo-news cycle on the Web.”

That’s one take. Another is that this mistake, and its path into the Times op-ed, shows how a claim, once published by one media outlet, can replicate itself in other reports. It also shows why relying on other outlets for sourcing and fact-checking is a risky proposition.

In the end, the result is an op-ed that at the very least lost one major data point in its argument. What remains is a viewpoint without a compelling stat to back it up. Does that negate the opinion? Or just make it less persuasive?

Second error: Student debt

Also on May 12, the Times published a long story about student debt in America. It was part of a special series, and it too included a significant numerical error. Here’s the correction:

An article on Sunday about college students’ debt, and an accompanying chart, misstated the percentage of bachelor’s degree recipients who had borrowed money for their education from the government, private lenders, or with the help of family members.

The article stated that the percentage had increased to 94 percent from 45 percent in 1993, based on data from the Department of Education, whose officials reviewed The Times’s methodology before publication. While the percentage of students borrowing for college has indeed increased significantly, the 94 percent figure reflected an inaccurate interpretation of the data, which came from a survey of 2007-2008 graduates.

That survey showed that 66 percent of bachelor’s degree recipients borrowed from the government or private lenders; an additional percentage of graduates had family members who borrowed on their behalf or who lent them money, meaning that the total percentage with college borrowing increased to more than 66 percent. But the precise figure isn’t known because the department survey did not address borrowing involving family members. (The earlier survey, of 1992-1993 graduates, found that just 45 percent of graduates had borrowed from all sources, including from family members.)

Long story, long correction.

The correction inspired a bit of personal reflection from Ben Wildavsky, author of “The Great Brain Race: How Global Universities Are Reshaping the World.” He led with his view that the Times’ error undercuts the story:

For anybody who missed it, there was an edu-wonk brouhaha this week over an embarrassing error in the New York Times’ big series on student debt. The Times vastly overstated the percentage of students with debt – a particularly significant mistake given that this statistic was the linchpin of the story – then ran a rather defensive correction three days later.

The incorrect statistic about student debt was delivered in a critical paragraph of the story, high up. (The psychopath error was in the lead paragraph.)

What’s disconcerting is that the story goes from having a definitive number — 94 percent — to a much lower one that carries less weight because it comes from data different from the initial benchmark (“the precise figure isn’t known”).

Would the story have been told differently if that hedged stat had been there all along? Would the paper have used such a wishy-washy number in its nut graf?

This is the thing with numbers in journalism: when they fit a thesis, they’re used as a major building block in a report. They often provide a data point to support anecdotal reporting.

So when those same numbers fail or are debunked, we shouldn’t be surprised to see them backfire to the point where they raise questions about an entire story.

Third error: Megasecond

It’s a numerical error too, though admittedly not of the same magnitude as the ones above. Still, this wonderful Times correction reinforces the fact that a mangled number can tell a very different story than the one intended:

An earlier version of this post included an erroneous reference to how long it took the people in the audience at “Death of a Salesman” to leap to their feet at the end. It was not a megasecond, which is one million seconds or about 11 days.
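For readers who want to check the correction’s own arithmetic, the conversion is straightforward (using 86,400 seconds in a day):

\[
1{,}000{,}000 \text{ s} \div 86{,}400 \text{ s/day} \approx 11.6 \text{ days}
\]

So “about 11 days” holds up, which is more than can be said for the original reference.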

The lesson? In the world of numbers, as with language, precision is king.
