Can Washington lead the war on disinformation?

August 2, 2019

Factually is a newsletter about fact-checking and accountability journalism, from Poynter’s International Fact-Checking Network & the American Press Institute’s Accountability Project. Sign up here.

Who leads the U.S. ‘war’ on disinformation?

When former U.S. Special Counsel Robert Mueller testified before the House Intelligence Committee last week about his investigation into Russian interference in the 2016 presidential election, some saw his comments about Moscow’s ongoing meddling attempts as the most important statement of the day.

“It wasn’t a single attempt,” he said when asked about the spread of disinformation and whether Moscow would replicate the efforts again. “They’re doing it as we sit here and they expect to do it during the next campaign.”

Now we also know from reporting by The Washington Post that it’s not just the Russians. Disinformation is coming from Iran as well as other foreign actors seeking to sow chaos and uncertainty among U.S. voters. Said The Post, “A short list of countries that host online influence operations with a history of interfering across borders includes Saudi Arabia, Israel, China, the United Arab Emirates and Venezuela, researchers say.”

Washington has a long tendency to rhetorically militarize national problems that need tackling, from the war on poverty to the war on drugs. But the need for a “war on disinformation” is not hyperbole, given the warnings of U.S. intelligence agencies that adversaries want to destabilize the U.S. information ecosystem.

It’s not clear, however, who can or will lead the charge in this conflict. Even as experts say the problem is worsening, it is unlikely that the current divided government could produce anything close to a solution.

The Wall Street Journal reported this week on a range of efforts by businesses, government research agencies and academics to combat disinformation such as deepfakes. Those include companies developing verification technology for cellphones to fight deepfake videos and doctored photos, as well as Defense Department research into forensic tools that could automatically detect manipulations.

What was striking in the Journal’s account was the sense of urgency among the companies interviewed, compared with the rhetoric coming from lawmakers. The Journal’s reporter, Abigail Summerville, quoted one CEO as predicting that “visually undetectable deepfakes” will appear in less than 12 months. More ominously, he said: “Society is going to start distrusting every piece of content they see.”

Some lawmakers have voiced alarm, too, and have proposed legislation to criminalize the creation and distribution of deepfakes. But mostly they have held hearings that reveal deep divisions over whom to blame and even who the enemy is in this war. President Donald Trump, meanwhile, is attacking Big Tech and accusing the media of being enemies of the people. He has also shown a reluctance to acknowledge that the 2016 result might have been influenced by outsiders.

Some people are trying. A pair of Washington consultants that includes retired Army Gen. Stanley McChrystal is suggesting a special commission to look at the issue in a comprehensive way. Such blue-ribbon panels are usually created to study complex problems, like the deficit, for which Congress can’t find a compromise. Their recommendations are rarely adopted, but they can create momentum if the problem is serious enough. Congress did, for example, accept recommendations from a commission created after the 9/11 attacks.

But even if that were to happen, it would almost certainly not bear fruit in time for 2020.

. . . technology

  • Six months after it began its fact-checking collaboration with Facebook, Full Fact has issued a report on what’s working and what’s not. The report included suggestions to improve Facebook’s third-party fact-checking program. Here’s the IFCN’s take, which noted wide news coverage of the report, including a story on the Times of London’s front page.
  • Speaking of Facebook, it is now seeking to read our minds, The Verge’s Adi Robertson reported. “Their work demonstrates a method of quickly ‘reading’ whole words and phrases from the brain — getting Facebook slightly closer to its dream of a noninvasive thought-typing system,” the report said.
  • The New York Times reported that Nigerian scammers are using Facebook to impersonate members of the U.S. military and con people into sending them gifts.

. . . politics

  • Fact-checkers in Uruguay, Bolivia, Argentina and Brazil have come together to form national coalitions and fight mis/disinformation in teams. Facebook posts and messages spread via WhatsApp are their main focus, as many of these nations are facing elections soon. The IFCN’s Daniela Flamini and Cristina Tardáguila’s report is here.

  • Daniela also published a story this week about the times in which politicians around the world have floated conspiracy theories to score points with voters.

  • How will the British press cover the new prime minister, Boris Johnson? Cristina explains in this post. The tactics include a call by London’s Channel 4 News for the public to flag questionable claims when they see them.

. . . the future of news

  • The New York Times this week explained its fact-checking operation ahead of the 2020 election, saying it intends to “shine a light on disinformation” in multiple ways, including by “bringing in journalists who specialize in spotting such material.”

  • Why do people fall for conspiracy theories? BuzzFeed’s Craig Silverman explains in this video that a person’s belief system is more powerful than a list of facts. And he gives tips for dealing with “Aunt Janet” who might be falling down a rabbit hole.

  • A joint report from Nigeria’s Centre for Democracy and Development and the University of Birmingham (UK) on WhatsApp usage during the Nigerian elections comes to conclusions that “are both troubling and encouraging,” The Conversation reported.

A lot of online misinformation tries to get people to share or click something either to make a political point or make advertising money. But some hoaxers are also trying to collect users’ personal information.

Such was the case with a rumor that Indian fact-checking outlet Boom Live debunked this week. The hoax, which circulated on WhatsApp and Facebook, claimed that the government of Prime Minister Narendra Modi was distributing free solar panels to people who filled out an online form.

That’s bogus, Boom reported — no such government initiative exists.

The form, hosted on a suspicious-looking Blogspot domain, collected personal details such as names and phone numbers as part of a phishing scheme. When users submitted their information, a message popped up telling them they must forward the link to 10 other WhatsApp groups.

What we liked: Boom addressed a scheme that tried to collect people’s personal information by manipulating the structure of social media platforms like WhatsApp. The fact-checker did a good job of reminding readers that, while some misinformation is relatively harmless, good old-fashioned phishing attempts are still rampant online.

  1. President Trump has amplified Twitter accounts linked to the conspiracy group QAnon, The Washington Post reported.

  2. TikTok is being used to spread a hoax about human trafficking.

  3. Want to learn more about how PolitiFact is covering misinformation? Tune in to a Reddit AMA with Daniel tomorrow at noon Eastern time!

  4. Digital marketers are gaming search results by buying expired domains that news organizations have linked to.

  5. Axios reported that American political groups have been creating local partisan websites that look like they’re publishing straight news.

  6. Research from NewsGuard found that 10% of the news sites Americans read publish health misinformation.

  7. Snopes has been known to fact-check websites that publish satirical content, on the grounds that they’re spreading misinformation. Now one of those sites is pushing back.

  8. It’s the last week to apply for the IFCN’s fellowship program. Apply now for a shot at winning $2,500!

  9. CBC reported that fake newspaper websites are continuing to pop up in Quebec — two years after they were exposed by another media outlet.

  10. Taiwan has prosecuted more than 110 people under its anti-misinformation law since authorities started cracking down on false posts online in December.

Finally, Poynter’s daily media newsletter has a new look and a new name: The Poynter Report with Tom Jones. It’s aimed at both news consumers and creators. Reading it only takes five minutes, and you’ll finish more informed, invested and even inspired. Subscribe here to get The Poynter Report in your inbox every weekday morning.

That’s it for this week. Feel free to send feedback and suggestions to dfunke@poynter.org or susan.benkelman@pressinstitute.org.