March 9, 2018

At a technology summit in Brussels, Maarten Schenk had a harebrained idea.

“We were brainstorming and thinking about how to reach people who like and share fake news online,” said Schenk, who runs the Lead Stories debunking site in Belgium. “Fake news sites don’t have any trouble reaching them, because your crazy uncle on Facebook always comes up with new fake news sites that you’ve never heard of. So we thought, ‘What if we try to mimic their tactics and see if we can beat the enemy with their own weapons?’”

On the second day of TechCamp, a workshop series sponsored by the United States Department of State’s Bureau of International Information Programs, other participants pitched in and helped build out the idea over the course of an afternoon. Some people wrote clickbait headlines for articles on how to discern fake news on Facebook, while others designed the basic WordPress template. Schenk, a former computer programmer, handled the technical side.

The group shared the stories on their own timelines, as well as in a few conservative groups. By the time it was their turn for a presentation, they had already reached hundreds of people online without using ads or paid promotion.

“We thought maybe this was something,” Schenk said.

So did the State Department.

After refining the idea during another TechCamp in Warsaw, the project — titled Forbidden Facts and run by Schenk, Utrecht University lecturer Jordy Nijenhuis and Olga Yurkova from StopFake.org — applied to the series’ Small Grants Program. In January, it was awarded $5,000 to run the site’s hosting and social promotion for six months, with a mid-year report to follow.

While the goal of many fact-checkers is to correct misinformation where it occurs, Schenk said Forbidden Facts aims to reach social media users who are skeptical of fact-checkers themselves.

“Of course we make no illusion that we think we’re going to change or fix fake news,” Schenk said. “We think it’s an interesting idea to reach people who would never trust Snopes or never trust Lead Stories because we’re evil.”

The site, which has drawn a little more than 3,000 page views in about a month, works by publishing clickbait or conspiracy-esque headlines on Facebook. Each headline links to a story with an equally incendiary lede that gives way to an explainer on how fake news spreads online.

“This article is a lot of bogus,” one explainer reads. “We wrote this article to show you how easy it is to make people believe in wild stories. You probably clicked on this link because you thought it was funny, it made you angry or sparked your interest.”

The hope is that such stories trick consumers of fake news into learning how to be more discerning readers.

“We’re trying to teach people something about how fake news works and how to recognize it,” Schenk said. “It’s like a toddler who gets burned on the stove won’t get burned a second time … we hope that’s a stronger message than another article about how to recognize fake news online.”

Obviously, writing sensationalist headlines raises all sorts of ethical concerns. How do you avoid simply publishing fake news that gets picked up by real misinformation sites?

At the TechCamp in Warsaw, Schenk’s project group came up with a set of guidelines to address those questions. The group doesn’t buy advertising for the site or its Facebook page, and each article is designed to be clickbaity rather than accusatory, with the story itself making clear that the goal is to trick readers into learning more about hyperpartisan content.

“We’re still using a conspiracy-like language and implying stuff that happened, but most of it happens in the imagination of the reader,” he said. “So if you’re already inclined to think that Trump did or said something, then you’ll want to click and read the rest and know what it’s all about.”

Of course, there’s the question of how many people actually click through to each story.

According to analytics Schenk sent to Poynter, Forbidden Facts’ Facebook reach was much larger than the number of people who actually clicked through to articles, suggesting that many users get duped without learning more about misinformation. Corrective information often comes far down in each story — sometimes after photos — and many Facebook comments don’t appear to be in on the joke.

The site’s stories have also already been picked up by at least one fake news site. From a media literacy perspective, that’s a problem.

“When we think about what actually happens … the major influence of these headlines aren’t the people who click through — the articles are almost irrelevant,” said Mike Caulfield, director of blended and networked learning at Washington State University Vancouver. “Lord knows how many people actually read through to the second paragraph.”

Instead of teaching people about misinformation and measuring how effective the method is, Caulfield said, Forbidden Facts could simply be exposing people to more false information on Facebook. When asked how the site differs from fake news sites that pose as satire, Schenk said it discloses its intentions in the body of each text rather than appending them in a footer, which readers might not see.

“The end goal is also different: We try to teach our readers something by explicitly exposing and explaining our own 'tricks' each time (hoping they'll learn to recognize them on other sites),” he said.

But that method might be flawed. Caulfield pointed to a common elementary school lesson plan that teaches children how to find reliable sources by asking them to research the fictitious Pacific Northwest tree octopus. While well-intentioned, that kind of approach might not have much effect in the long run.

“A lot of the things we do in the classroom that work under non-emotional situations don’t always work in the real world, because of course people are really upset,” Caulfield said. “The other way is that the techniques of looking real are constantly evolving. This is kind of a never-ending war against the scammers.”

Another problem: Confronting people on social media may not be the best way to teach.

“People are more receptive to instruction at the point that they feel they need the instruction,” Caulfield said. “One of the problems in instructional design is that most people [who] are under false conceptions don’t know that they’re false — they think their conception of the world is perfectly adequate. And as such, they aren’t really open to deconstructing their model of the universe and replacing it.”

Still, while the Forbidden Facts project may not be a breakthrough in media literacy, Schenk said that if only one person starts to doubt Facebook pages as a result, he’ll consider it a success.

“If people won’t trust us, that’s a good sign,” he said. “That’s exactly the sort of people you’re not trying to reach.”
