May 10, 2019

Full Fact and Chequeado have long been at the forefront of using artificial intelligence to improve their work. Now, both fact-checkers have $2 million to help them do it.

In a press release sent to Poynter on Wednesday, United Kingdom-based Full Fact announced that it — along with Chequeado, Africa Check and the Open Data Institute — won a Google.org grant to bolster their work using AI to partially automate fact-checking. It is one of 20 projects selected from more than 2,600 applicants for the Google AI Impact Challenge.

Over the course of three years, the four organizations will use AI to “dramatically improve and scale fact-checking, working with international experts to define how artificial intelligence could transform this work, to develop new tools and to deploy and evaluate them.” In addition to the $2 million, which the grant winners will share, that work will be bolstered by support from Google’s own in-house AI experts.

“In three years, we hope our project will help policymakers understand how to responsibly tackle misinformation, help internet platforms make fair and informed decisions and help individual citizens know better who and what they can trust,” said Mevan Babakar, Full Fact’s head of automated fact-checking, in the release.

The Google.org grant is among the largest single payouts for using AI to improve fact-checking — a sign that foundations and tech companies think automated fact-checking matters, and it’s here to stay.

In the past few years, Full Fact has developed a platform that automatically scans media and government transcripts for claims and matches them with existing fact checks in the outlet’s database. It also creates “robo-checks” that automatically match statistical claims with official government data. The Duke Reporters’ Lab and Chequeado have both built similar tools that notify fact-checkers about potential fact checks by scanning media transcripts.
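To make the matching step concrete, here is a minimal sketch of how an incoming claim could be compared against a database of existing fact checks using sentence embeddings. It illustrates the general technique only, not Full Fact's or Chequeado's actual pipeline; the model name, the sample fact checks and the similarity threshold are placeholders chosen for the example.

```python
# Toy claim matching: embed a new sentence and compare it against claims
# that have already been fact-checked. Illustrative only; not any
# fact-checker's real system.
from sentence_transformers import SentenceTransformer, util

# Small general-purpose model; a production system would use models tuned
# for claims in the relevant language(s).
model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical database of already-published fact checks.
fact_checks = [
    "Unemployment is at its lowest level in 40 years.",
    "Crime has doubled since 2010.",
    "The UK sends 350 million pounds a week to the EU.",
]
fact_check_embeddings = model.encode(fact_checks, convert_to_tensor=True)

def match_claim(sentence, threshold=0.7):
    """Return fact checks whose claims closely resemble the new sentence."""
    query = model.encode(sentence, convert_to_tensor=True)
    scores = util.cos_sim(query, fact_check_embeddings)[0]
    return [
        (fact_checks[i], float(scores[i]))
        for i in range(len(fact_checks))
        if scores[i] >= threshold
    ]

# A paraphrased version of a stored claim should still surface a match.
print(match_claim("We send 350 million pounds to Brussels every single week."))
```

The appeal of this approach is that a claim does not have to be repeated word for word to be caught, which matters when the same talking point resurfaces in slightly different phrasing.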

The potential for these tools is great. Instead of watching hours of television or combing over every speech transcript for potential claims, fact-checkers can leverage AI to do some of the legwork for them. And that’s what several fact-checkers around the world are already doing.

In January of last year, The Washington Post Fact Checker published an article based on a claim that was surfaced by the Reporters’ Lab Tech & Check Alerts. The same thing happened with a Factcheck.org article published last month.

And beyond making it easier to research a fact check, automated technology could make it easier to publish fact checks, too.

Automated fact-checking has come a long way. But it still faces significant challenges.

In an article published in The Atlantic this week, reporter Jonathan Rauch went behind the scenes of the Reporters’ Lab’s efforts to automatically fact-check Donald Trump’s State of the Union address this year. When Trump said something that had been fact-checked before, a chyron appeared with the statement verbatim and a verdict.

While impressive, that system did have some misfires, and it isn’t yet ready to air during live TV broadcasts. And automated fact-checking still faces other major problems.

In a fact sheet published in February of last year, Lucas Graves, a senior research fellow at the Reuters Institute, wrote that most existing systems “can only identify simple declarative statements, missing implied claims or claims embedded in complex sentences which humans recognize easily” (a toy sketch of that claim-spotting step, and its limits, follows the list below). At last year’s Tech & Check meeting, which convenes journalists and technologists at the Reporters’ Lab, participants identified a few other key problem areas:

  • Parsing through messy TV and government transcripts to find fact-checkable claims and identify speakers
  • Working around natural-language processors’ English bias to develop tools in non-English-speaking countries, as well as keeping them up to date with political language
  • Getting and maintaining access to reliable official datasets
  • Avoiding duplication of efforts as fact-checking tools get developed around the world
  • Getting funding from major donors in regions where it’s less easily available
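As promised above, here is a deliberately naive sketch of the claim-spotting step Graves describes: flagging transcript sentences that look like simple statistical claims. The patterns and sample transcript are invented for the example, and the point is as much what such rules miss as what they catch.

```python
# Crude claim spotting: flag sentences that resemble simple statistical
# claims. Implied claims and claims buried in complex sentences slip
# straight past rules like these, which is exactly the limitation
# described above.
import re

CLAIM_PATTERNS = [
    r"\b\d+(\.\d+)?\s*(percent|%)",       # "wages rose 4.2 percent"
    r"\b(million|billion|trillion)\b",    # large-number claims
    r"\b(rose|fell|doubled|halved|increased|decreased)\b",
]

def looks_checkable(sentence: str) -> bool:
    """Crude check-worthiness test for a single transcript sentence."""
    return any(re.search(p, sentence, re.IGNORECASE) for p in CLAIM_PATTERNS)

transcript = [
    "Thank you all for being here tonight.",             # ignored: no claim
    "Wages rose 4.2 percent last year.",                  # caught: simple declarative claim
    "Unlike my opponent, I actually balanced budgets.",   # missed: implied, context-dependent claim
]
for line in transcript:
    print(looks_checkable(line), line)
```

Real systems replace rules like these with trained statistical models, but the gap Graves describes, implied and context-dependent claims, remains hard to close.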

Google.org’s recent grant solves at least two of those problems — and could potentially address all of them. So how will Full Fact, Chequeado and Africa Check use their $2 million?

Babakar said it comes down to maintaining existing projects so that there’s room for growth.

“It’s buying time in each organization to define the problems and opportunities in the space, collect data around them and start to answer some of this with AI approaches,” she said in an email. “We have a big focus on transparently evaluating how well they’re doing in multiple political and societal contexts.”

“We want this to be a step change for researchers and data scientists in the space, and we want it to benefit fact-checkers everywhere.”
