October 17, 2018

It was kind of a last resort.

For months, Cristina Tardáguila had been monitoring misinformation on WhatsApp in Brazil. The private messaging platform, which has 120 million users in a country of 200 million, is a major source of fake photos, videos and stories about this month’s presidential election.

Tardáguila, director of the fact-checking organization Agência Lupa, employed a method that many other fact-checkers around the world use: Ask your readers to send you potential misinformation, fact-check it and then distribute back to your readers. Since WhatsApp is encrypted, there’s no other way to do it — not even the company itself can see when or where something is shared on the platform.

But after weeks of silence from WhatsApp on what it was doing to tackle misinformation ahead of Brazil’s election, Tardáguila got fed up. So she published an op-ed in The New York Times asking the company to do something.

“Every time, WhatsApp says they’re not going to comment. That’s why we decided to go to the biggest megaphone in the world, The New York Times — so that WhatsApp can hear us,” she said. “Let’s see if they can comment now.”

In the op-ed, Tardáguila, along with researchers at the University of São Paulo and the Federal University of Minas Gerais, asked WhatsApp to take three emergency actions until the final round of voting Oct. 28. They are:

  1. Ban the ability to forward messages to more than five people, similar to what has been done in India.
  2. Limit the number of people you can broadcast messages to in groups. Currently, Brazilians can broadcast to 256 people and those people can broadcast the same message to another 256 people.
  3. Cap the number of members in new groups at less than 256.

“It’s just until the 28th of October — it’s not forever,” Tardáguila said. “It’s just too much. WhatsApp is dividing the country into two big groups.”

In the op-ed, Tardáguila wrote that she asked WhatsApp to implement those changes this week and the company responded by saying there wasn’t enough time. Poynter reached out to WhatsApp for comment but had not heard back as of publication.

The move comes amid a misinformation-marred election in Brazil. During the first round of voting Oct. 7, hoaxes about voter fraud were rampant on WhatsApp — one fact-checker went as far as to call it the story of the election. Fact-checkers themselves have become targets of online harassment campaigns and Facebook has removed several pages and accounts that were found to be spreading misinformation.


And while many fact-checkers agree that WhatsApp was a primary driver of many of the viral conspiracies they're debunking on a daily basis, they've been unable to quantify their reach — until this month.

Wednesday’s op-ed was based on details from an analysis (sent to Poynter) in which Lupa, for the first time, quantified the reach of misinformation on the private messaging app. Using a monitoring system developed by researchers at UFMG, the fact-checking project was able to look at the top 50 images that appeared most often in 347 public WhatsApp groups between Aug. 16 and Election Day.

It found that eight of the 50 images were completely false, four were completely true and 16 were real but taken out of context; the rest were somewhere in the middle. Lupa also found that both the left and the right shared misleading or false photos on WhatsApp.

“It gives you a sense of how big misinformation is in WhatsApp in Brazil,” Tardáguila said.

The UFMG system, called Eleições Sem Fake (“Elections Without Fake”) and developed by a group of five researchers and students in the computer science department, pulls data from its sample of public WhatsApp groups related to politics. Fabrício Benevenuto, an associate professor, told Poynter his team finds the groups — which administrators make public by sharing a URL on the internet — by crawling the web for public links.

“What we have built is a system that monitors public groups,” he said. “When activists or people simply create a group, put the group on Twitter or Facebook and say, ‘Hey, you want to support this candidate, you want to help this candidate win? Just join the group and help us.’”

“There are groups for all the politicians in Brazil, even those without enough votes (to win).”
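The crawling step Benevenuto describes can be sketched in a few lines. This is a hypothetical illustration, not the UFMG team's actual code: WhatsApp group invite links share a recognizable `chat.whatsapp.com` URL pattern, so a crawler can scan fetched web pages for them. The function name and the exact invite-code pattern are assumptions.

```python
import re

# WhatsApp group invite links follow a known URL shape:
# https://chat.whatsapp.com/<invite code> (sometimes with an /invite/ prefix).
# The exact code alphabet is an assumption for this sketch.
INVITE_RE = re.compile(r"https://chat\.whatsapp\.com/(?:invite/)?[A-Za-z0-9]+")

def find_invite_links(html):
    """Return the unique WhatsApp group invite links found in a page's HTML.

    A real crawler would fetch the page first (e.g. with urllib or requests)
    and then filter groups by political keywords, as the UFMG team did.
    """
    return sorted(set(INVITE_RE.findall(html)))
```

For example, running `find_invite_links` over a scraped Twitter or Facebook post that shares a group link would surface the invite URL, which a researcher could then open on one of the project's phones to join the group manually.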

After identifying public groups, UFMG researchers use five separate phones and WhatsApp numbers to join them so that they can see what’s being discussed. They manually weed out any groups that aren’t political, Benevenuto said.

A user in a single group can’t see how many times a specific image or video has been shared elsewhere. But with Eleições Sem Fake, fact-checkers can see how many times a given post or piece of content has been shared across other public groups.

“It shows what’s inside these groups. For example, at the top (of the dashboard), you can see an image that appeared in 40, 50 groups in one day,” Benevenuto said. “The focus is more to help the journalists and understand what’s going on inside the groups and generate alerts into what’s affecting the Brazilian election.”

All content shared by the groups in Eleições Sem Fake’s database is contained in an online dashboard on UFMG’s website, which Poynter was granted access to. From there, you can see which posts were shared the most on any given day and in which groups they appeared. You can even do a reverse image search on photos.

(Screenshot from Eleições Sem Fake)

The system works by pulling from data that the Eleições Sem Fake team downloads from its WhatsApp accounts each day, Benevenuto said. Almost the entire process — which the researchers did not get input from WhatsApp on — is automated.

“Everything is automated. Just the step of entering the groups is what we do manually,” he said. “And then we just see per day and say, ‘OK this same image has appeared in different groups.’ Just count how many times it appears in different groups.”
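The counting step Benevenuto describes — spotting the same image across many groups on the same day — can be sketched as follows. This is a minimal illustration under assumptions, not the team's implementation: it fingerprints each image by hashing its bytes and counts how many distinct groups that fingerprint appeared in per day. (Exact-byte hashing misses re-encoded copies of the same photo; a production system would more likely use a perceptual hash.)

```python
import hashlib
from collections import defaultdict

def rank_images_by_group_spread(messages):
    """Rank images by how many distinct groups they appeared in, per day.

    messages: iterable of (date, group_id, image_bytes) tuples, e.g. the
    content downloaded daily from the monitoring phones (an assumed format).
    Returns {date: [(image_hash, group_count), ...]} sorted by spread.
    """
    # Map (date, image fingerprint) -> set of groups it appeared in.
    seen = defaultdict(set)
    for date, group_id, image_bytes in messages:
        fingerprint = hashlib.sha256(image_bytes).hexdigest()
        seen[(date, fingerprint)].add(group_id)

    # Collapse to per-day rankings, most widely shared image first.
    per_day = defaultdict(list)
    for (date, fingerprint), groups in seen.items():
        per_day[date].append((fingerprint, len(groups)))
    for date in per_day:
        per_day[date].sort(key=lambda pair: -pair[1])
    return dict(per_day)
```

An image surfacing at the top of this ranking with dozens of groups in one day is exactly the kind of alert the dashboard raises for journalists.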

(Screenshots from Eleições Sem Fake)

In that way, Eleições Sem Fake is an unprecedented look at how information is shared on WhatsApp. But the system only analyzes a tiny fraction of the public groups on the platform.

“We are doing it like this, but there are thousands of groups,” Benevenuto said. “In the end, we just decided to (search for) the names of politicians.”

Based on what he’s learned about how misinformation spreads with the monitoring tool, Benevenuto said he thinks WhatsApp should more carefully regulate how users share messages between different groups. Since the same misinformation often jumps between public groups, he thinks false content quickly trickles down to private chats because users can forward and broadcast messages at high volumes.

To Benevenuto, having WhatsApp commit to the three actions he outlined with Tardáguila in The New York Times would be a good way to prevent that from happening Oct. 28.

“To WhatsApp, I just suggested to restrict information from spreading. It’s like containing some pandemic,” he said. “It’s like putting Brazil in quarantine.”

Daniel Funke is a staff writer covering online misinformation for PolitiFact. He previously reported for Poynter as a fact-checking reporter and a Google News Lab…
Daniel Funke
