Climate Feedback, a scientist-led effort to “peer review” the world’s climate journalism, is closing in on its $30,000 crowdfunding target. A successful conclusion to the campaign would bolster one of the most prominent efforts yet to conduct fact-checking via web annotation.
Annotation allows critics to add line-by-line comments to webpages in a corrective “layer” that doesn’t change the original content — but can certainly change minds. Last July, after Climate Feedback’s volunteer scientists shredded a particularly egregious Telegraph climate story, the paper revised the offending text. (The editors did, however, leave an inaccurate headline intact.)
Such results suggest that annotation could become a powerful force in fact-checking. But annotation has been around for decades, pre-dating the web itself. So why does it always seem just about to take off? Are we just clearing a series of technical and social hurdles — or are we banging into a brick wall?
PolitiFact has been experimenting with the technology since early last year, when it posted a transcript of the 2015 State of the Union address on its website, and marked the text up using the annotation platform Genius. More recently, PolitiFact has started to annotate politicians’ posts on the blogging and publishing platform Medium, using a $140,000 grant from the Knight Foundation.
“Medium is this new space, and politicians who use it can avoid the scrutiny of the independent press,” PolitiFact executive director Aaron Sharockman says. “We’re trying to see essentially if we can make a bigger impact because we’re fact-checking right in the space where the claim is made.”
Annotation also works well for critiquing journalism, says Climate Feedback project scientist Emmanuel Vincent. The line-by-line feedback allows the original author to easily see where he or she may have gone wrong. This piecemeal approach helps to spread the workload between several experts, Vincent says.
But annotation in its current form presents several problems, Sharockman says. Crucially, most who read Medium posts by Joe Biden or Hillary Clinton will never see the PolitiFact annotations. “If you don’t follow us on Medium, you won’t see them unless the politician approves them — which we think will probably never happen,” Sharockman acknowledges.
Even most readers of PolitiFact’s website aren’t following the outlet on Medium yet, Sharockman says.
PolitiFact’s annotated State of the Union ran into another fundamental problem: the fact checks were drowned out by readers’ comments.
“Some people would say if people are engaging in your annotation… then they’re on your site longer, that means they’re interested,” Sharockman says. “I understand that, but we’re trying to create a permanent record that people can access for years to come… What it turned into is essentially a soup of people’s opinions.”
Vincent says he’s not bothered about the effects of public annotation — such comments have yet to overwhelm any of his team’s annotations — but Dan Whaley, founder of the annotation startup Hypothesis, describes the issue as “super urgent.” The organization is hoping to finish development this year on an optional feature that will make annotations open for public reading, yet closed for public comment.
Some familiar with annotation point to additional roadblocks. Dan Gillmor, who teaches media literacy at Arizona State University’s Walter Cronkite School of Journalism and Mass Communication, argues, “Audio and video totally don’t lend themselves to [annotation]” — yet, Gillmor says, those are the most popular sources of news.
Rob Ennals, who ran Intel’s now-defunct Dispute Finder extension, argues that annotation is having trouble building an audience — and the critiques simply don’t happen fast enough. By the time an article has been rebutted, he says, it’s often fallen off the news cycle.
“My eventual conclusion was that the web annotation approach doesn’t really work — which is why it hasn’t taken off,” says Ennals, who now works at the crowdsourced question-and-answer site Quora.
Hypothesis does plan to develop audio and video annotation, Whaley says, and the problem shouldn’t be too challenging. But, he argues, text is more likely to stand the test of time as a reference. And while he acknowledges that annotation must work to attract audiences, Whaley is confident it can do so by following the Climate Feedback model — that is, facilitating “communities of experts.” As these communities grow, Whaley says, response time will drop.
Vincent agrees — he hopes Climate Feedback will begin to turn many of its critiques around in a day, down from the current two or three.
But Gillmor raises another concern. “Only the most devoted reader is going to sort through all of the stuff in an annotation,” he argues. Annotation’s biggest potential, Gillmor says, may be in alerting authors to their mistakes.
While Whaley points to Climate Feedback’s summary ratings as one way to engage hurried readers, he also agrees with Gillmor: the key audience for line-by-line rebuttals is the piece’s original author.
“What we’re trying to do is make it more and more uncomfortable for journalists and others to write articles that don’t stand up to analysis, and for publishers to support them doing so,” Whaley says. “We don’t need everyone to read the annotations in order to have this effect.”