November 17, 2017

JOHANNESBURG — Fact-checkers wrote down their wildest dreams on Post-it notes, sticking them to the glass wall of an airy 12th-floor office at Wits University. And by the time Mevan Babakar was finished presenting, most of them seemed to have come true.
 
Babakar, a digital product manager at Full Fact, was showing off the most recent iteration of the British fact-checking organization’s automated platform. The Thursday demonstration came ahead of a two-day summit hosted by Africa Check in Johannesburg, where about 20 journalists from around the world have gathered to discuss the future of fact-checking.
 
During Full Fact’s presentation, that future seemed nigh.
 
“Full Fact is one of the leading organizations in the world that is developing tools to make fact-checkers’ jobs easier,” said Peter Cunliffe-Jones, executive director of Africa Check, while opening the meeting.
 
The automated fact-checking platform is essentially a web dashboard that is divided into separate, tabular information streams. It automatically pulls caption data from news organizations such as the BBC, as well as audio from Parliament, which it transcribes in real time.
 
But perhaps the most innovative part of the platform is how it surfaces fact checks. Using a manually programmed set of search queries and patterns based on specific words, the tool automatically scans each stream of text for claim language that relates to any of the thousands of fact checks in Full Fact’s database. Once it finds a match, it displays the existing fact check in a right rail next to the claim, from which it can be tweeted.
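Full Fact hasn’t published the code behind this matching step, but the approach described — hand-written, keyword-based patterns checked against a store of published fact checks — can be sketched in a few lines. The Python below is purely illustrative; the patterns, conclusion and URL are invented, not Full Fact’s data.

```python
import re

# Illustrative stand-in for a fact-check database: each entry pairs
# hand-written word patterns with the published fact check they map to.
FACT_CHECKS = [
    {
        "patterns": [r"\b350\s*million\b.*\bNHS\b", r"\bNHS\b.*\b350\s*million\b"],
        "conclusion": "The UK does not send £350 million a week to the EU.",
        "url": "https://example.org/factchecks/350-million",  # placeholder URL
    },
    # ...thousands more entries in a real system
]

def match_claims(sentence: str):
    """Return every stored fact check whose patterns match this sentence."""
    return [
        check for check in FACT_CHECKS
        if any(re.search(p, sentence, re.IGNORECASE) for p in check["patterns"])
    ]

# A transcribed subtitle or parliamentary line arrives from the live streams...
line = "We send 350 million pounds a week that the NHS could use."
for check in match_claims(line):
    # ...and any match is surfaced next to the claim, ready to be tweeted.
    print(f"Claim spotted: {line!r}\n  -> {check['conclusion']} ({check['url']})")
```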

(Full Fact screenshot, courtesy Mevan Babakar)

In short, the platform automatically pulls existing fact checks about statements that are repeated in the news and government, making it easier for Full Fact to re-promote them in real time.
 
“Fact-checkers are often small teams and they’re expected to do a lot,” Babakar said. “One of the goals is to make it possible to let fact-checkers work (faster) … It’s about cutting down the time it takes and not losing out on content.”
 
In addition to the Live part of the platform, Full Fact built a Trends section that keeps track of recognized claims across the different information streams. Users can create visualizations depicting how frequently a statement has been made over time and — hopefully — whether their fact checks had an effect on its dissemination.
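How Trends is implemented isn’t detailed here, but the underlying idea — counting how often a recognized claim recurs over time so its “life cycle” can be charted — is simple to sketch. The dates and claim identifiers below are invented for illustration:

```python
from collections import Counter
from datetime import date

# Hypothetical output of the claim-matching step: when each recognized
# claim was spotted, and which stored fact check it matched.
sightings = [
    (date(2017, 10, 2), "350-million-a-week"),
    (date(2017, 10, 2), "350-million-a-week"),
    (date(2017, 10, 9), "350-million-a-week"),
    (date(2017, 11, 6), "record-employment"),
]

# Bucket sightings by ISO week so a chart can show whether a claim keeps
# circulating after the corresponding fact check is published.
weekly = Counter((claim_id, day.isocalendar()[1]) for day, claim_id in sightings)

for (claim_id, week), count in sorted(weekly.items()):
    print(f"{claim_id}: week {week} -> seen {count} time(s)")
```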

Babakar said that specific tool addresses an important need in the fact-checking community: tracking impact.
 
“With Trends specifically, what we’re trying to do is find a better way of measuring impact,” she said. “I think we’d like to know what it looks like on a wider plane — being able to say, ‘Here is the life cycle of a claim.’”

(Full Fact screenshot, courtesy Mevan Babakar)

Full Fact has been a leading voice advocating for automated fact-checking for several years. Babakar said the organization first started thinking about building a new workflow ahead of the 2015 elections in the United Kingdom, when Full Fact still had a relatively small team. They wanted a way to quickly fact-check political claims that they had already verified or debunked.
  
“Within seconds, we would fact-check a statement that was just said,” she said, “and that was only because our fact-checkers had spent the past few months looking at actually everything that had been said. We had pre-prepared answers to various claims.”
 
In August 2016, the organization published an ambitious report detailing live fact-checking tools that could make journalists’ lives easier, arguing that such tools weren’t far off. More than a year later, Full Fact appears to have successfully built out several of those ideas, albeit in a way that relies on manual inputs.
 
“It starts really, really simply,” Babakar said, “but even in its simplest form, it’s really useful.”
 
Full Fact isn’t the only fact-checking organization that has taken a stab at developing automated technology; Chequeado worked with Full Fact for a week in January 2016 (on a Poynter-funded fellowship) to brainstorm tools for fact-checkers.

Two notable projects — The Washington Post’s Truth Teller and a master’s thesis called Truth Goggles — both sought to automatically display fact checks alongside claims made in text and video, with mixed results. In September, the Duke Reporters’ Lab announced a $1.2 million project to build automated fact-checking tools and apps.
 
What makes Full Fact’s platform different is its support, methodology and inward-facing goal. The organization used $500,000 in funding from the Omidyar Network and Open Society Foundations to hire two additional developers, which — coupled with the fact that Full Fact’s team was already pretty techy — helped it build a working beta version of the platform, Babakar said. Additionally, the project’s small scale and its fact-checker audience made the process smoother.
 
“Because we’re building tools for ourselves, we have the expertise in house,” Babakar said. “We were super realistic about what we could build … the tech we’re using right now is really basic and it’s been around forever — we’re just applying it in a way that’s meaningful and relevant.”
 
While the beta version of the live fact-checking platform was created specifically for claims relevant to the U.K., Full Fact has started working with both Chequeado and Africa Check to build out versions that are relevant to their audiences. That effort is hindered by the fact that news and legislative captions aren’t as readily available in most countries as they are in the U.K., which Babakar said could be addressed in the short term by using (admittedly imperfect) transcription software.
 
Cunliffe-Jones said his team is hoping to incorporate Full Fact’s tool in its Nigeria and South Africa operations.
 
“The intention is for the tool to be developed into other languages,” he said.
 
But whether Full Fact’s proprietary technology will be rolled out to fact-checking organizations en masse remains to be seen.
 
Babakar said that, while making its automated platform open-source would enable other fact-checkers to similarly hasten their work, it could also be weaponized by fake news purveyors or people who want to sow distrust online.
 
“It’s more contentious when you open-source a project when there is actively someone on the other side of the fence that’s trying to work against you,” she said. “There are people who are using the same tech for nefarious means.”
 
Full Fact is planning to release its platform in October 2018 and is looking for ways to provide its automated fact-checking tool to more organizations (including possibly partnering with the International Fact-Checking Network), Babakar said. Full Fact is also hoping to improve the tool by incorporating machine learning and voice-to-text technology, which both Google and Amazon have invested heavily in improving.
 
Even in a perfect world, that technology is at least five years away, Babakar said. But automation platforms present numerous possibilities.
 
“There’s potential to build other things, too — these just seemed to be the most valuable to fact-checkers right now,” she said.

