Misinformation and claims of censorship
The major social media platforms aren’t always in lockstep on what content they moderate. But this week, Twitter, Facebook and YouTube were all on the same page in blocking a video of a group called “America’s Frontline Doctors” touting the anti-malaria drug hydroxychloroquine as a cure for COVID-19, contrary to scientific evidence. One of the doctors said “you don’t need masks” to halt the spread of the virus.
By now, the story of the video is well known — the retweets by President Donald Trump and his son, the fact-checks that followed, and the bizarre beliefs of one of the doctors involved, Stella Immanuel.
What happened in the days after that, though, is key in understanding the methods and tactics of people who push unproven “cures” and other falsehoods and then have their content blocked: The blocking itself and the claims of censorship that follow become part of the attempt to get attention.
The day after the video of their Washington press conference was removed, the white-coated doctors were out again delivering the same messages, but with an added angle: They were being silenced.
“We’re coming after you, Big Tech, we’re coming after you,” said Simone Gold, one of the doctors leading the effort. “We won’t be silenced.”
The “censorship” message then took off among the doctors’ supporters on Twitter and other platforms.
This is a common tactic among groups that champion unconventional messages. The censorship claim becomes central to their efforts to control the narrative, said Aimee Rinehart, U.S. deputy director of the nonprofit organization First Draft, which fights disinformation.
Cries that “Big Tech is censoring us!” become part of the attention grab, she said, even though the platforms are clear that they will only remove content that spreads false information about the coronavirus or messages that suppress the vote.
The doctors’ events were also held the same week that the CEOs of Amazon, Google, Facebook and Apple (Twitter was not among them) were testifying before a House subcommittee that is probing the power of the tech companies. The timing was convenient for the doctors: There was a good chance the platforms’ decision to take down the video would come up in the hearing, and it did.
In short, the doctors succeeded in inserting their cause into the hearing, in effect using the platforms’ content moderation decision to extend what might otherwise have been written off as a one-news-cycle fringe event.
– Susan Benkelman, API
. . . technology
- BuzzFeed News reported on internal discord at Facebook over the company’s approach to some of President Trump’s divisive posts about the George Floyd protests.
- Facebook employees who spoke to BuzzFeed cited a “lack of consistency and poor communication around enforcement of its community standards as a key frustration.”
- When a social media platform takes steps to block or limit the reach of the conspiracy theory QAnon, “it only attacks one part of the problem,” wrote the MIT Technology Review’s Abby Ohlheiser. The platforms, she wrote, need a multi-faceted approach to deal with what one expert called an “omniconspiracy.”
- Among the people she interviewed was Steven Hassan, a mental health counselor and former cult member who said people need to be educated on how and when they are being manipulated online.
. . . politics
- The New York Times reported that newly declassified intelligence shows how Russian intelligence services are spreading disinformation about COVID-19, including propaganda from China that the virus was created by the United States.
- “The disinformation efforts are a refinement of what Russia tried to do in 2016,” wrote the Times’ Julian E. Barnes and David E. Sanger.
- The New York Times’ Anton Troianovski reported on the controversy over Ukrainian fact-checking organization StopFake’s alleged far-right ties.
- Ukrainian news outlet Zaborona cited StopFake’s Marko Suprun’s attendance at a 2017 Youth Nationalist Conference.
- StopFake is an International Fact-Checking Network Code of Principles signatory, and IFCN Director Baybars Örsek said it will conduct an assessment of the fact-checking organization’s compliance.
. . . science and health
- The Guardian reported on what it called a tsunami of fake coronavirus cures and treatments throughout Latin America, with conspiracies and hoaxes that range “from the bizarre to the ridiculous.”
- “Many of the false claims include miracle Covid-19 cures including Peruvian sea water, Venezuelan lemongrass and elderberry tea and supernatural seeds being hawked by one Brazilian televangelist,” the report said.
- The Sinclair Broadcast Group decided to pull a segment it had planned with an interview of a scientist featured in the “Plandemic” video, which contains a number of baseless theories about the COVID-19 pandemic.
Chloroquine has been shown to be ineffective at treating COVID-19, according to studies by both the World Health Organization and the U.S. Centers for Disease Control and Prevention. Ivermectin, a medicine used to treat heartworm in animals and roundworm in humans, has shown some promise in early studies as a COVID-19 treatment, but it has not been properly vetted and approved for that use.
Both fact-checkers talked to experts who explained that chloroquine and ivermectin are synthesized by combining other chemicals in laboratory settings. They do not occur in citrus fruit peels. Both also noted that misinformation about using citrus to treat COVID-19 is not new, and they put this latest hoax in that context.
What we liked: This is a distinctive fact-check that builds on the work fact-checkers have been doing throughout the infodemic. It reiterates the current scientific understanding of chloroquine’s efficacy and recognizes the recurring trope of citrus fruits as a COVID-19 treatment. This falsehood combines those two narratives, and Aos Fatos and Agência Lupa unpack that for their readers.
– Harrison Mantas, IFCN
- In a status report on their automated fact-checking project, the Tech & Check Cooperative, the Duke Reporters’ Lab’s Bill Adair and Mark Stencel wrote in a piece for Nieman Lab that they’ve made progress, but have concluded that human help is still necessary.
- Anti-masking groups in Canada are adopting techniques from and even joining forces with people in anti-vaccine movements, the CBC’s Nicole Ireland reported.
- Stat News, the health and medicine site, explored Facebook’s difficulties policing vaccine misinformation on the platform, calling the situation “dire.”
- Sen. David Perdue (R-Ga.) deleted a Facebook ad that showed a photo of his Democratic challenger, Jon Ossoff, who is Jewish, that had been manipulated to make Ossoff’s nose appear bigger. The incumbent’s campaign told The Forward that the distortion was accidental.
- Gulin Cavus, editor-in-chief of Turkish fact-checking organization Teyit, spoke to NPR about that country’s new law tightening controls on social media.
That’s it for this week! Feel free to send feedback and suggestions to firstname.lastname@example.org. And if this newsletter was forwarded to you, or if you’re reading it on the web, you can subscribe here. Thanks for reading.