March 7, 2019

Factually is a newsletter about fact-checking and accountability journalism, from Poynter’s International Fact-Checking Network & the American Press Institute’s Accountability Project. Sign up here.

Anti-vaccine content draws scrutiny

Social media platforms’ role in fueling anti-vaccination sentiment is drawing continued scrutiny in light of a surge in measles outbreaks around the world.

It is a particular concern in the United States, where lawmakers have been holding hearings on the issue, including one Tuesday in which Ethan Lindenberger, the 18-year-old son of an anti-vaxxer, was the star witness testifying before a Senate committee.

Lindenberger didn’t contract a preventable disease, fortunately, but his story was compelling: He questioned his own mother’s anti-vaccination beliefs, did research and ended up getting his shots after he turned 18, when he no longer needed her authorization. He put the blame for what he called his mother’s well-intentioned decisions on “deeply rooted misinformation” online, which, he told lawmakers, should be at the forefront of the vaccine debate.

Senators seemed to be on the same page.

“Charlatans and internet fraudsters who claim that vaccines aren’t safe are preying on the unfounded fears and daily struggles of parents, and they are creating a public health hazard that is entirely preventable,” said Sen. Lamar Alexander (R-Tenn.), who chairs the panel that held the hearing.

The question is what Washington will — and can — do about it. Earlier this month, Rep. Adam Schiff (D-Calif.) wrote to the CEOs of Facebook and Google seeking information about steps they are taking “to address this growing problem.” He did not, however, imply a threat of action.

Lawmakers’ options may be limited given First Amendment protections. Congress could try to chip away at a federal law that offers platforms protections from liability for content that third parties post on their sites, but that would almost certainly generate epic legal challenges.

And it’s not even clear that social media is substantially responsible for increasing vaccine hesitancy, University of Michigan public policy professor Brendan Nyhan wrote in The New York Times on Wednesday. He suggested that state vaccination exemption policies might be a more appropriate target.

In the meantime, tech companies are making moves on their own.

CNN reported last week that Facebook is working with public health experts to find ways to make anti-vaccine posts “less prominent.” YouTube is preventing ads on anti-vaccination videos. Amazon removed anti-vaccine documentaries from its Prime video service after a letter from Schiff. Pinterest is deliberately manipulating its search function to turn up nothing when a user looks for content on vaccinations.

It’s hard to know whether these actions are a response to political pressure alone. But at a time when lawmakers are examining privacy laws, antitrust issues and the companies’ role in disseminating political disinformation in advance of the 2020 election, such moves could help the industry stave off attempts to regulate content, at least for now.



  • The European Commission called for the debunking of myths and misinformation about migration, as it reported on a decline in migration flows. Here’s its fact sheet addressing what it says are fake news and untruths. At the same time, the EU said the platforms still aren’t doing enough to combat disinformation.
  • Benjamin T. Decker, a research fellow at Harvard University, traced the origins of a meme questioning Sen. Kamala Harris’ (D-Calif.) identity. Here’s what he found. In the same vein, The New Republic published a dissection of how critics of Rep. Alexandria Ocasio-Cortez (D-N.Y.) have perpetuated the false idea that she wants to ban meat.
  • In Thailand, the deputy leader of an up-and-coming political party could be among the first people charged under the country’s new anti-misinformation law. The legislation, which scapegoats “fake news” as a way to regulate the media, is an expansion of a 2007 law that punishes anti-government criticism.

…the future of news

  • Thirty experts and 40 volunteers helped Buenos Aires-based Chequeado live fact-check a presidential speech last Friday. Fact-checkers also got an assist from Chequeabot, the automated fact-checking tool that leverages natural language processing and machine learning to match new claims with past fact checks.
  • “Deepfake propaganda is not a real problem,” reads this headline from The Verge. Since Motherboard first reported on the manipulated videos in 2017, no major threats have materialized online. Meanwhile, news organizations continue to raise the alarm about deepfakes.
  • Mark Zuckerberg said Wednesday that he’s planning to make Facebook more private and encrypted. Our concern: Depending on how this policy is carried out, it could make it harder for fact-checkers to find and debunk misinformation.
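The claim matching that the Chequeabot bullet above describes can be sketched in a much-simplified form as a bag-of-words cosine-similarity search over past fact checks. This is an illustrative toy, not Chequeado's actual system: the function names, the tokenization, and the 0.3 threshold are all assumptions made for the example.

```python
import math
from collections import Counter

def vectorize(text):
    """Turn a claim into a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two Counter vectors (0.0 to 1.0)."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def match_claim(new_claim, past_fact_checks, threshold=0.3):
    """Return the best-matching past fact check above the threshold, else None."""
    query = vectorize(new_claim)
    scored = [(cosine_similarity(query, vectorize(fc)), fc)
              for fc in past_fact_checks]
    best_score, best_fc = max(scored)
    return best_fc if best_score >= threshold else None

# Hypothetical archive of previously checked claims.
past = [
    "inflation reached 40 percent last year",
    "the unemployment rate fell to 9 percent",
]
print(match_claim("last year inflation reached 40 percent", past))
```

A production system would replace the word counts with learned embeddings and handle paraphrase, but the core idea — score a new claim against an archive and surface the closest prior check — is the same.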

Each week, we analyze five of the top-performing fact checks on Facebook to see how their reach compared to the hoaxes they debunked. Here are this week’s numbers.

  1. “Obama Didn’t Give Iran ‘150 Billion in Cash’” (Fact: 26.5K engagements // Fake: 151.4K engagements)

  2. Full Fact: “UK taxpayers aren’t subsidising France’s 35-hour working week” (Fact: 2.1K engagements // Fake: 91.9K engagements)

  3. Les Décodeurs: “These ‘yellow vests of Bordeaux’ which are neither ‘yellow vests’ nor ‘of Bordeaux’” (Fact: 1.7K engagements // Fake: 5.9K engagements)

  4. PolitiFact: “No, undocumented immigrants don’t get Medicare for free” (Fact: 1.2K engagements // Fake: 15.3K engagements)

  5. Agência Lupa: “The photo of the birthday party of Lionel Messi’s son was doctored” (Fact: 799 engagements // Fake: 128.1K engagements)

Snopes investigated how activists behind a website called The Tennessee Star have “used the appearance of local newspapers to promote messages paid for or supported by outside or undisclosed interests.” The story, published this week, further details how these conservative activists have been expanding the effort to other states expected to be battlegrounds in the 2020 elections.

What we liked: This is an investigative piece as opposed to a straightforward fact check. But if you’re interested in how activism can be packaged as local journalism, this is a solid case study. It deconstructs the site’s content, analyzes the business structure and details the otherwise undisclosed political connections of the owners.

It also gives credit where credit is due rather than shying away from the initial work that Politico did on The Tennessee Star in April 2018, quoting its story liberally.

  1. Infamous hoaxer Christopher Blair has created another satirical version of Snopes, which often debunks his stories on Facebook. Their feud continues.
  2. Exciting news: The IFCN is hiring its first-ever summer intern! The program lasts 10 weeks and pays $10,000. Apply by March 30!
  3. Also exciting: Poynter’s MediaWise project is hiring a multimedia reporter to teach journalism skills and debunk misinformation on social media.
  4. Peter Cunliffe-Jones has stepped down as executive director of Africa Check. Current deputy director Noko Makgato will replace him.
  5. In an article entitled “Anyone Can Get Trolled — Even The New Yorker,” HuffPost profiles Jonathan Lee Riches, describing him as “an underground menace long before Gamergate and the alt-right.”
  6. Snopes has flagged it, India’s Boom has flagged it, and now, says PolitiFact, a posting that falsely portrays Syrian children killed in the 2013 gas attack as victims of organ trafficking in Asia has resurfaced on Facebook.
  7. Want to learn how to use WhatsApp to debunk misinformation? Join Poynter for a one-hour webinar on March 20.
  8. In India, Boom Live investigated how several mainstream media outlets ran a story about a nonexistent wing commander based on social media rumors. It was part of an ongoing misinformation crisis between India and Pakistan.
  9. First Draft is hosting three identical summits on how to combat misinformation in the EU.
  10. Russian lawmakers want to give the government greater control of Internet content by creating what the Los Angeles Times says is a “sovereign network that the Kremlin could shut off from the greater World Wide Web.”

That’s it for this week. Feel free to send feedback and suggestions to

Daniel and Susan

Support high-integrity, independent journalism that serves democracy. Make a gift to Poynter today. The Poynter Institute is a nonpartisan, nonprofit organization, and your gift helps us make good journalism better.
Daniel Funke is a staff writer covering online misinformation for PolitiFact. He previously reported for Poynter as a fact-checking reporter and a Google News Lab…