More than six months after announcing a project to combat fake news, Facebook released an update today.
The Facebook Journalism Project — a three-pronged approach aimed at providing training, products and tools for journalists and news consumers — was created as a way to solidify relationships with media organizations, which often have an ambivalent relationship with the social network. One key part of the initiative is curbing the spread of false or inaccurate news on the platform. (Disclosure: Poynter's International Fact-Checking Network plays a role in verifying third-party fact-checkers.)
So, how much has the social media giant done to combat fake news?
According to the update, quite a bit. The steps Facebook has taken to discourage the publication of false or inaccurate news include limiting spam or fake news publishers' access to ad buying, reducing the reach of posts that link to "low-quality web page experiences," introducing ranking updates to limit the spread of false news and clickbait, and testing new ways for people to report a false story. (Read about the rest of Facebook's ongoing initiatives here.)
"We know people want to see accurate information and high-quality news on Facebook," said Head of News Partnerships Campbell Brown and Vice President of Product Fidji Simo in the Thursday blog post. "We want to empower people to identify misleading news content when they encounter it — on any platform."
Another interesting step in Facebook's battle against fake news is the company's move to disable non-publisher Pages' ability to modify link previews, a feature it says has been abused by people looking to spread false or inaccurate stories. News organizations can still overwrite link metadata, such as the headline or description, using a new tab under "Page Publishing Tools," according to a Tuesday blog post.
Despite its moves to combat the spread of fake news, Facebook has come under fire in recent months for what some say has been a rocky effort. The Guardian reported in May that the tools the social network has rolled out have been largely ineffective at limiting disinformation online. Reporter Sam Levin wrote:
A Guardian review of false news articles and interviews with fact-checkers and writers who produce fake content suggests that Facebook’s highly promoted initiatives are regularly ineffective, and in some cases appear to be having minimal impact. Articles formally debunked by Facebook’s fact-checking partners – including the Associated Press, Snopes, ABC News and PolitiFact – frequently remain on the site without the 'disputed' tag warning users about the content. And when fake news stories do get branded as potentially false, the label often comes after the story has already gone viral and the damage has been done.
Additionally, Facebook has yet to share any real data about its fact-checking initiatives or the amount of fake news on the platform. In November, CEO Mark Zuckerberg wrote that, of all the content on Facebook, "more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes." Poynter reported that the claim had to be based on internal research, not publicly available data.
The recent changes to Facebook come amid a flurry of modifications to its publishing tools, such as the addition of a tool that allows publishers to compare their mobile and Instant Articles traffic. Other updates in the six-month review of the Facebook Journalism Project include several improvements to Instant Articles — such as call-to-action buttons, subscriptions and ad support — updates to the Facebook-owned audience-tracking tool CrowdTangle and new partnerships with the Knight-Lenfest Newsroom Initiative and the First Draft Network.
"We’ve learned so much since we launched the Facebook Journalism Project and this collaboration is already driving innovation that we couldn’t have achieved on our own," Brown and Simo wrote in the Thursday blog post. "It’s going to take a concerted effort on all of our parts to help build a future where quality journalism can thrive."