Fact checks couldn’t contain the virality of that altered Pelosi video. But that doesn’t mean we should give up on them.

Nancy Pelosi didn’t slur her words at an event in Washington, D.C., last week. But a video making it look like she did has been viewed millions of times on Facebook.

The video, which a hyperpartisan Facebook page posted last Thursday, slowed down and edited footage of the U.S. Speaker of the House of Representatives giving a speech at the Center for American Progress. The Washington Post reported that both President Donald Trump and his personal lawyer, Rudy Giuliani, had shared versions of the hoax on Twitter.

At least five fact-checking sites that partner with Facebook debunked the altered video in the days after it was first posted. Facebook appended those fact checks to the video and used them to decrease its future reach and notify people who had shared it. (Disclosure: Being a signatory of Poynter’s International Fact-Checking Network code of principles is a necessary condition for joining the project.)

But they were too late: The bogus Pelosi video had racked up more than 88,000 reactions, shares and comments as of publication. And the cumulative engagement of fact-checkers’ work was only about 19,000 reactions, shares and comments.

Below is a chart with all the fact checks in order of how many engagements they got on Facebook, according to data from BuzzSumo and CrowdTangle.

Those numbers are bleak, but perhaps not surprising.

Fact-checkers regularly struggle to contain the spread of false posts on Facebook — even when the platform manually decreases their reach in the News Feed. The reasons have been written about to death: algorithms are designed to amplify sensational claims, fact checks take a lot of time to research and misinformation is relatively easy to spread.

But that doesn’t mean fact-checking on Facebook never works. While fact-checkers struggled to contain the virality of the bogus Pelosi video on Facebook, others around the world had better luck amassing more reach than the falsehoods they debunked.

One example is this Friday fact check from Teyit, a fact-checking site based in Turkey. It debunked a political meme on Facebook that took a photo of a politician drinking water out of context to spread a false narrative that he was violating the Muslim holy month of Ramadan.

That article had racked up nearly 30,000 reactions, shares and comments as of publication, according to BuzzSumo. For comparison, the bogus meme got only about 1,200 engagements.

And Teyit’s fact check wasn’t the only one to get tens of thousands of engagements this week.

In an article published Friday, Factcheck.org debunked a photo posted on Facebook that claimed to show one of Trump’s campaign rallies. In fact, it depicts a crowd at the Woodstock music festival in 1969.

Factcheck.org’s story amassed more than 23,000 reactions, shares and comments as of publication, compared to the out-of-context photo’s 2,900.

Below is a chart with other top fact checks since last Tuesday in order of how many engagements they got on Facebook. Read more about our methodology here.

Since it was manually edited to be misleading, the Pelosi video hoax is obviously different from those kinds of Facebook hoaxes. And fact checks that surpass the reach of the hoaxes may be less common than the reverse.

But these are still important examples of why Facebook’s fact-checking project can’t be written off as a waste of time. And they back up what fact-checkers and researchers have said about fact-checking’s effect on misinformation.

In an anonymous December survey of 19 fact-checkers working with the tech giant, most said they thought their work was at least somewhat reducing the reach of hoaxes on Facebook. While those same fact-checkers asked for more data on how their work is affecting misinformation at scale, the piecemeal examples that Poynter has documented and other research suggest that their hunch has some merit.

This week, CNN published an article authored by two academics whose recent research shows that fact-checking could prevent misinformation from shaping people’s thoughts. Jeremy Cone, an assistant professor of psychology at Williams College, and Melissa Ferguson, senior associate dean of social sciences at Cornell University, conducted seven experiments with more than 3,100 participants.

What they found is that, when people are presented with new, credible information that refutes their preconceived notions about a person, they tend to update their views of that person. That finding is similar to those of other studies about fact-checking and misinformation.

And, while all of this evidence is limited in one way or another, it suggests that fact-checking can work on Facebook — even when some false posts go viral. It just needs to get faster.

“Tech companies struggling with how to respond to misinformation should support and value their fact-checkers,” Cone and Ferguson wrote. “They may be the cyber pillars we need to resurrect our democracy.”