Fact-checkers and critics reacted positively to Facebook’s latest moves to fight coronavirus misinformation on the platform, although some still believe more can be done.
Last Thursday, Facebook announced it would begin notifying users who had liked, reacted to or commented on harmful COVID-19 misinformation that the platform has since removed.
For example, users who’ve interacted with hoaxes about bleach, alcohol consumption or 5G conspiracy theories, which have the potential to cause physical harm to them or their communities, will be notified and directed to the World Health Organization for more explanation.
Facebook’s current strategy to fight misinformation started in 2016 when it launched the Third-Party Fact-Checking Program (3PFC). Facebook uses the International Fact-Checking Network’s Code of Principles to choose fact-checking partners from around the world who vet content on the platform. When content is considered false, Facebook notifies users who’ve interacted with the post and then reduces the content’s visibility. Facebook currently partners with fact-checking organizations in 76 countries and regions.
The biggest change in Facebook’s latest announcement is that it addresses content that’s been removed from the platform for violating its community standards. In a blog post, Facebook founder and CEO Mark Zuckerberg said, “if a piece of content contains harmful misinformation that could lead to imminent physical harm, then we’ll take it down.”
Fadi Quran, campaign director for the advocacy group Avaaz, called the move “a key first step in cleaning up the dangerous infodemic surrounding the coronavirus,” but in an email to the IFCN, Quran urged the company to go further.
“An academic study shows that ‘correcting the record’ for users can decrease the number of those who believe disinformation by more than half, and Facebook can implement this solution today,” he wrote.
“I think it’s a good approach,” said Pablo Medina, director of 3PFC partner Colombiacheck. “Facebook has to be more proactive on its platform. Though, as always, more should be done, especially to target those who don’t believe WHO.” Medina suggested potentially linking to trusted local government sources or including the work of local fact-checkers.
Natalia Leal, director of the Brazilian fact-checking network Agencia Lupa, said the new moves will help users who have been exposed to dangerous misinformation get accurate information about content that fact-checkers haven’t had the opportunity to check. “Misinformation spreads so fast that it will be always a huge problem,” Leal said.
Will Moy, chief executive of British fact-checking organization and 3PFC partner Full Fact, said in an email to the IFCN that Facebook’s move could be a model for other social media platforms too.
“Now that Facebook has shown that they can do this, we will have to ask if other internet companies, and other media outlets, will do the same,” Moy wrote. Full Fact joined the 3PFC program in 2019, and in July of that year it released a transparency report chronicling its early experience with the program.
The report concluded the 3PFC program was, on balance, beneficial in the fight against misinformation, but it made 10 recommendations for how Facebook could improve the program. These included increased data sharing with fact-checkers, adding more categories to the rating system and expanding the 3PFC program to Instagram. Facebook agreed with the suggestion to extend fact-checking to Instagram, launching it in the U.S. in August 2019 before expanding the effort globally.
Moy said he hoped Facebook would share what it learns from these new moves to help others in the fight against misinformation. Full Fact is working on a follow-up to its 2019 report that looks at Facebook’s recent policy changes and assesses the 3PFC program over the past year. Moy said Full Fact plans to publish the report in the next few months.