How users see Facebook’s labels will determine their effectiveness

This is the July 23, 2020 edition of Factually


Factually is a newsletter about fact-checking and accountability journalism, from Poynter’s International Fact-Checking Network & the American Press Institute’s Accountability Project. Sign up here

What does a label really mean?

This week, Facebook attached “Get Voting Information” links to posts by both President Donald Trump and former Vice President Joe Biden as part of its larger push to promote accurate election information on the platform. These additions come two months after Twitter attached a similar label to one of Trump’s tweets, which some at the time characterized as an attempt to fact-check the president.

In June, CEO Mark Zuckerberg announced that politicians would not be exempt from this new labeling policy, and that the company was committed to fighting voter suppression on its platform. However, this latest attempt has raised questions about the effects of the labels and how users will perceive them. Here are three to consider:

1) Will users know the difference between a label and a fact-check?

We can’t say it enough: A label is not a fact-check. Twitter said as much when it applied a label to the Trump tweet in May. Susan discussed this in the May 28 edition of Factually, and predicted more fights to come over these labels. The question is how users will see the Facebook labels. Even though they’re not fact-checks, will they inadvertently send a signal that the content is questionable?

Perhaps not. Facebook’s users are already exposed to fact-checks produced by members of its Third-Party Fact-Checking Program. Posts reviewed by fact-checkers are given one of nine labels along with an explanation of why the post was fact-checked. (Disclosure: Facebook requires its fact-checking partners to be signatories to the International Fact-Checking Network’s Code of Principles before they can work with the company. You can read about the whole process here.)

2) Alternatively, will the labels seem like endorsements of the content being labeled?

Among those who expressed that concern is election law expert Rick Hasen, a professor of law and political science at the University of California, Irvine.

“This warning seems pretty useless,” Hasen tweeted. “It might even seem that Facebook is endorsing what Trump is saying and providing a path for more information.”

Facebook contends its only goal is to increase participation by helping its users register to vote in the upcoming elections.

3) Will Facebook’s labeling be consistent?

Already there are battles brewing over whether Facebook will apply the labels consistently. Members of the Biden campaign have taken issue with the Facebook label. They argue Trump’s post falsely asserts there will be rampant fraud with mail-in voting, yet it is given the same label as a Biden post urging his supporters to vote against the president in November. However, Zuckerberg has argued speech by politicians, whether said on Facebook or disseminated in the news media, deserves to be debated by the public, and he has resisted calls to remove false political claims.

Right now, the voting label is being applied equally to all posts that mention voting, as a way to encourage people to get more information about casting their ballots. In his June blog post, Zuckerberg promised Facebook would take down any false claims designed to discourage voting, adding that politicians would not be exempt.

– Harrison Mantas, IFCN

. . . technology

  • Twitter said Tuesday it was taking down accounts associated with the conspiracy theory QAnon as part of enforcement action against “behavior that has the potential to lead to offline harm.”
    • NBC News’ Ben Collins and Brandy Zadrozny reported that a Twitter spokesperson said it had taken down more than 7,000 QAnon accounts and that the company’s decision to stop recommending content and accounts related to QAnon would affect about 150,000 accounts.
  • Last week’s Twitter hack, a bitcoin scam, provoked a lot of concern among experts about potentially more catastrophic scenarios involving disinformation. 
    • “What if the hackers had been seeking not profit, but conflict? What if, rather than tweeting a crude bitcoin hoax, they had tried to provoke war?” wrote Heather Williams, a lecturer in defense studies at King’s College London, in The Washington Post.
    • CNET quoted Joan Donovan, the research director at Harvard’s Shorenstein Center on Media, Politics and Public Policy, as saying that the hack should make people question everything on Twitter.

. . . politics

  • The UK government “badly underestimated” Russia’s ability to sow disinformation, according to a new report from the Intelligence and Security Committee of Parliament, the BBC reported.
    • The BBC quoted a member of the committee as saying that the government “took its eye off the ball, because of its focus on counterterrorism,” and that “the government had badly underestimated the response required to the Russian threat, and is still playing catch up.”

. . . science and health

  • Facebook said it has removed a group called “Unmasking America,” which The Verge reported was one of the largest anti-mask groups on the social media platform.
    • The Verge’s Kim Lyons wrote that the group’s page included photos of people wearing masks that said “Make America Great Again” and other slogans or references to President Trump.
    • “Other posts described experiences dealing with stores that require masks, and many posters asked how to claim an ‘exemption’ from mask rules,” Lyons wrote.
  • The Christian Science Monitor profiled a doctor in Burkina Faso who is battling coronavirus misinformation with a radio show and a group of volunteers going door to door.


There is no billboard sponsored by the pharmaceutical company Merck urging people to get a vaccine if they are “tired of mask idiocy and social distancing.” And there is no “Taketheshot.gov” website.

AFP fact-checker Claire Savage debunked the photo, which doesn’t use the words coronavirus or COVID-19, but is nonetheless designed to suggest that the pharmaceutical industry is circulating propaganda to convince people to get a vaccine against the virus (even before it is available).

The “billboard” photo is amateurish, but that hasn’t stopped it from being circulated. It has made the rounds on Facebook since early July. Savage found examples in the United States and Canada and noted that even the prominent anti-vaxxer Robert F. Kennedy, Jr. shared it on Instagram, though he voiced doubt that it was real.

What we liked: Savage’s sleuthing made this fact-check work. Through a reverse image search she found stock art of a fake blank billboard that the hoaxers used for their handiwork. She also tracked down clipart of a cartoonish doctor used in the image, and she confirmed with Merck officials that the billboard wasn’t real.

– Susan Benkelman, API

  1. Here is a service for reporters and others tracking COVID-19 data. ProPublica’s Caroline Chen and Ash Ng have put together a helpful explainer on how to read case counts, hospitalization data and other numbers associated with the virus.
  2. The CEOs of Facebook, Apple, Amazon and Google will testify before a House subcommittee Monday. Antitrust and market dominance are the main topics, but as Axios reported, lawmakers “are sure to press the platforms on how they’re controlling misinformation on both paid advertisements and people’s posts online.”
  3. More than 280 staffers at The Wall Street Journal and Dow Jones have called for a clearer distinction between news and opinion content online, the Journal reported, saying the opinion section’s “lack of fact-checking and transparency, and its apparent disregard for evidence” is undermining reader trust.
  4. The Los Angeles Times reported that celebrities including Jennifer Aniston and John Oliver are promoting mask-wearing and fighting conspiracy theories about COVID-19.
  5. Are you a reporter covering QAnon? Here are some tips from Susan for newsroom discussions on how to handle politicians who promote conspiracy theories.

That’s it for this week! Feel free to send feedback and suggestions to factually@poynter.org. And if this newsletter was forwarded to you, or if you’re reading it on the web, you can subscribe here. Thanks for reading.

Harrison and Susan