August 7, 2018

When someone tweeted a misleading news story about a Brazilian presidential candidate earlier this month, Fátima sprang into action within a few hours.

“Hey! This news distorts and manipulates statements that Marina Silva gave to GloboNews,” the account tweeted Aug. 1.

Since then, Fátima has shared the same fact check with Twitter users more than 30 times. The account publishes similar debunks about 15 times a day — all without human intervention.

Built by Aos Fatos, Fátima (shorthand for “fact machine”) scans Twitter for fake news stories that the Brazilian fact-checking project has already debunked, drawing on a database of more than 1,000 URLs. The bot then automatically replies to people who share the misinformation with a link to the relevant fact check.
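
In rough terms, the bot’s loop is simple: search Twitter for each debunked URL, then reply to the tweets that contain it. Here is a minimal Python sketch of that approach — an illustration, not Aos Fatos’ actual code; the database entries and credentials are placeholders, and it assumes the Tweepy library (version 3.x):

```python
# Simplified sketch of a Fátima-style bot (illustrative, not Aos Fatos' code).
# Assumes Tweepy 3.x and a small placeholder mapping of hoax URLs to fact checks.
import tweepy

DEBUNKED = {
    # hoax URL -> published fact check (placeholder entries)
    "https://example.com/misleading-story": "https://example.org/fact-check",
}

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth, wait_on_rate_limit=True)

def reply_to_sharers():
    for hoax_url, fact_check in DEBUNKED.items():
        # Twitter's search can match tweets that contain a given URL.
        for tweet in api.search(q=hoax_url, count=20):
            message = (f"@{tweet.user.screen_name} Hey! This story has already "
                       f"been fact-checked: {fact_check}")
            api.update_status(status=message, in_reply_to_status_id=tweet.id)

if __name__ == "__main__":
    reply_to_sharers()
```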

In Brazil, stories like the one about Silva’s comments have become a staple of misinformation leading up to the October general election. And Tai Nalon said that, since launching about three weeks ago, Fátima has already seen some success in limiting the spread of hoaxes on Twitter — which has more than 25 million users in the country.

“People go (to Fátima) to pressure the person who posted the fake news content into deleting it. They endorse what Fátima is saying,” the Aos Fatos director said. “That’s really crazy because I did not expect that to happen.”

“We’ve seen people deleting their tweets and thanking Fátima for the warning, which is something quite unusual.”

Aos Fatos received R$150,000 (more than $45,000) from Facebook to build a version of Fátima on Facebook, and another R$100,000 from an independent organization for the Twitter bot. And it’s just one example of how fact-checkers and journalists are using technology to scale their anti-misinformation efforts on social media ahead of the Brazilian election.


On Monday, a collaborative fact-checking project launched by First Draft to cut down on the spread of fake news ahead of the election published its first debunks. The coalition, named “Comprova” and made up of 24 newsrooms around Brazil, is the first to use the WhatsApp Business API to debunk viral misinformation on the private messaging app.

“It’s designed for bigger companies. The question is: Will that work in a journalism and fact-checking way?” said Claire Wardle, executive director of First Draft, a project of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School of Government. “We couldn’t have done this, really, without the ability to have multiple feeds.”

Several fact-checking outlets around the world have used WhatsApp Business to ask users to send them potential misinformation on the app; the fact-checkers then verify those claims and send their articles back the same way. The Android app lets fact-checkers add in-depth descriptions with contact information, respond to anyone who messages them and access basic analytics, which makes it a little easier to fact-check on the encrypted platform.

According to a First Draft press release sent to Poynter, the WhatsApp Business API will give Comprova (which means “with proof” in Brazilian Portuguese) the ability to have multiple users operate the same account at the same time. Without it, fact-checkers can only have two people run a single WhatsApp account at the same time — one on a smartphone and one on a computer.

Wardle said the API will also let Comprova journalists broadcast messages to their subscribers all at once, instead of posting the same debunk in several different groups, which are capped at 256 people. And aside from helping to scale fact-checkers’ work on WhatsApp, which has about 120 million users in Brazil, she said the partnership will allow researchers at the Kennedy School to gain more insight on how misinformation and fact checks spread on the app.
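
In practice, a “broadcast” over the API boils down to sending the same message to each opted-in subscriber. A rough, hypothetical Python sketch of that step is below — the host, token and subscriber list are placeholders rather than Comprova’s actual setup, and the payload roughly follows the on-premises WhatsApp Business API client’s text-message format:

```python
# Hypothetical sketch: pushing one debunk to opted-in subscribers through
# the WhatsApp Business API (on-premises client). Host, token and subscriber
# IDs are placeholders, not Comprova's actual configuration.
import requests

API_HOST = "https://whatsapp.example.org"   # your WhatsApp Business API client
AUTH_TOKEN = "YOUR_BEARER_TOKEN"
SUBSCRIBERS = ["5511999990000", "5521988880000"]  # opted-in WhatsApp IDs

def broadcast_debunk(text):
    for wa_id in SUBSCRIBERS:
        resp = requests.post(
            f"{API_HOST}/v1/messages",
            headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
            json={"to": wa_id, "type": "text", "text": {"body": text}},
        )
        resp.raise_for_status()

broadcast_debunk("We checked it: this viral story about the election is false. "
                 "Read the full fact check: https://example.org/fact-check")
```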

“What’s interesting to me about WhatsApp is that, as journalists, we want to lump WhatsApp in with all the other social platforms — but you have to think about this as a sociologist,” Wardle said. “We have to have ambassadors that are feeding in tips all around the country from the groups that they’re in.”

“That’s a huge challenge and I don’t think anyone’s got that quite right yet across the world.”

Comprova is being funded in part by the Google News Initiative and Facebook, both of which will also lend technical support to the coalition. Meanwhile, fact-checkers are also trying to limit the spread of fake news on the latter platform.

Both Aos Fatos and Agência Lupa, another fact-checking project, joined Facebook’s anti-misinformation project in May, along with the Agence France-Presse’s Brazilian operation. The partnership allows them to debunk and flag fake news on Facebook, limiting its future reach in News Feed by up to 80 percent. (Disclosure: Being a signatory of the International Fact-Checking Network’s code of principles is a necessary condition for participation in the project.)

And that effort goes beyond just limiting the spread of misinformation on Facebook, which has about 125 million monthly users in Brazil — fact-checkers are also trying to make people more scrupulous news consumers.

On Aug. 15, Agência Lupa will debut a Facebook bot that automatically answers questions about the veracity of political statements and viral fake news stories. The bot, “Projeto Lupe!,” was inspired by a Messenger model tested by Le Monde’s Les Décodeurs during the 2017 French election and is similar to another one that Aos Fatos plans to launch soon. Facebook is funding it with R$250,000 (about $75,000).

Director Cristina Tardáguila told Poynter in a WhatsApp message that Projeto Lupe!’s main value will be in keeping users updated on their latest fact checks.

“Facebook users will be able to search claims by candidate, by any specific topic or a time frame and see Agência Lupa's best fact-checked articles. They will be able to set up push alerts,” she said. “This bot will be connected to our website and will be automatically [fed] every time we publish something. (During) the election, that means many times a day.”
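
As a rough illustration of how a Messenger bot like that hangs together, here is a hypothetical webhook sketch in Python using Flask and Facebook’s Send API. It is loosely modeled on the description above, not Agência Lupa’s actual code: the search_fact_checks helper, page token and reply text are placeholders, and webhook verification is omitted.

```python
# Illustrative Messenger fact-check bot sketch (not Agência Lupa's code).
# Assumes Flask and Facebook's Send API; search_fact_checks is a placeholder.
import requests
from flask import Flask, request

app = Flask(__name__)
PAGE_ACCESS_TOKEN = "YOUR_PAGE_TOKEN"
SEND_API = "https://graph.facebook.com/v2.6/me/messages"

def search_fact_checks(query):
    # Placeholder: look up published fact checks by candidate, topic or date.
    return ["https://example.org/fact-checks?q=" + query.replace(" ", "+")]

@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json()
    for entry in payload.get("entry", []):
        for event in entry.get("messaging", []):
            if "message" in event and "text" in event["message"]:
                links = search_fact_checks(event["message"]["text"])
                reply = "Here is what we have checked:\n" + "\n".join(links)
                requests.post(
                    SEND_API,
                    params={"access_token": PAGE_ACCESS_TOKEN},
                    json={"recipient": {"id": event["sender"]["id"]},
                          "message": {"text": reply}},
                )
    return "ok"
```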

But whether all of these projects will have a measurable effect on how misinformation influences the outcome of Brazil’s election remains to be seen.


Nalon said there are limits to what Fátima can do. People have asked her why the bot doesn’t work on WhatsApp, which has become a primary source of misinformation for Brazilians. (The messaging app doesn’t support that kind of automation.)

“WhatsApp is still a black box — we don’t know how many people have received those kinds of hoaxes, we don’t know anything about what’s going on there,” Nalon said. “I’m glad that we are working on other social media, but we have a huge gap to fill on WhatsApp.”

Partisanship is another ongoing challenge for fact-checkers in Brazil, who were harassed online by members of the far-right after joining Facebook’s fact-checking project. And after the tech company banned a network of inauthentic accounts and pages in late July, a right-wing movement quickly protested the decision.

Some people have blocked Fátima after it’s corrected them, meaning they can’t see any future replies from it. And in some cases, the bot and others like it might actually make things worse.

“I feel bad because it could amplify polarization,” Nalon said. “I don’t think people want to get harassed by sharing fake news, but it’s something that is good, too.

“People want to see others getting corrected — common people, not politicians or just the media.”

Clarification: Facebook gave Aos Fatos R$150,000 to build Fátima on Facebook, not Twitter. That bot receives different funding.
