At New York event, Facebook stays tight-lipped on fake news
NEW YORK — At a City University of New York event to discuss Facebook’s efforts to fight misinformation, the company followed a familiar playbook of talking about its News Feed initiatives broadly, without offering much insight into their effect.
Adam Mosseri, head of Facebook’s News Feed, visited the CUNY Graduate School of Journalism this morning for a fireside chat on how the tech giant is working to improve its primary product. Moderated by CUNY professor Jeff Jarvis, the event — which filled an entire room with journalists from places such as USA Today and NBC News — was partially billed as a discussion on how Facebook is tackling its longstanding fake news problem.
“We have been eager to get data on how (the fake news flag) works,” said Jarvis, who oversees CUNY’s News Integrity Initiative, which Facebook partially funds. “What can you tell us about the impact of the flag — especially on user behavior?”
“One really important thing to understand is the third-party fact-checking program gets a lot of attention because it’s the most tangible part of what we do to combat false news,” Mosseri responded, “but it is actually a small part of a much larger effort.”
He then launched into an explanation of the different ways Facebook addresses misinformation, including by disrupting financial incentives for fake news sources, better identifying which pages are deliberately misleading and adding features to give users more context about specific pieces of content.
The exchange was almost identical to one Poynter documented in April, at the International Journalism Festival. In Italy, Mosseri told Jarvis that “the third-party program is important, but it’s only one part of a larger system.”
At CUNY, Jarvis followed up.
“Do you have any data at all about the impact of fact-checking claims on users?” he asked.
Mosseri responded by saying Facebook has started to share more data with its fact-checking partners (which the company requires to be signatories of the International Fact-Checking Network Code of Principles). It was a nod to a story BuzzFeed’s Craig Silverman published earlier this month that reported Facebook’s disputed tag and algorithm adjustment for fake news stories help reduce their reach by about 80 percent over an average of three days. The article was based on a leaked email from Facebook to its fact-checking partners.
Tucker Bounds, a spokesman for Facebook, told Poynter earlier this month that the company is planning to share another update on the project with its partners before the end of the year.
“Moving into 2018, we’d like to provide more regular communication and news updates to our partners,” he said. “Whether they come in the form of emails or phone calls or direct meetings, it’s difficult to say. It is something we want to do better.”
After the event, Poynter asked Mosseri to clarify whether Facebook had any plans to share data more publicly in the future. He said there isn’t anything on the books, but that the company is “always looking into it” — a point a Facebook spokesperson made to Poynter earlier this month.
“Latency is something that is a reasonable concern for anyone involved in the complex world of fact-checking and trying to combat the false news phenomena,” the spokesperson said. “Facebook has to guard a number of concerns when it’s talking about data that could potentially compromise the security of our platform or the privacy of the people who use Facebook.”
Today’s discussion at CUNY comes amid a week of controversy for Facebook, which faced backlash Monday for testing two separate feeds — one for posts from users’ friends and one for content from publishers — in Sri Lanka, Bolivia, Slovakia, Serbia, Guatemala and Cambodia. Mosseri said Facebook is very unlikely to roll the split feed out to other markets as-is, and that the company could have handled the situation better.
“I think people underestimate how many times we test,” he said. “We could have communicated it better.”
Later, a journalist from NBC asked Mosseri how Facebook could be more transparent with its publishing partners about changes it makes to the News Feed. He responded by saying that while small shifts in post engagement can be explained by any number of factors, such as time, date and post volume, Facebook is looking for better ways to communicate bigger changes to publishers.
But he said the company will probably not let news organizations know each time it’s testing something new on the News Feed.
“It would actually cause more harm than good,” Mosseri said. “Most don’t get launched, so I worry about scaring a bunch of people about things that are never going to actually happen.”
In addition to addressing the two-feed testing debacle that consumed media news Monday, the newest information Facebook shared at the event was tangential to its work in countering online misinformation — and central to its effort to share best practices with publishers.
At CUNY, Mosseri announced a new, extended series of guidelines for publishers on how to use Facebook effectively, which include tips like “create content that your audience would find interesting” and “consider using CrowdTangle.” The document builds on the company’s past efforts to distribute best practices to news organizations and is organized around three principles describing the kinds of content that Facebook users value.
Mosseri said the guidelines — which detail how the News Feed penalizes clickbait, low-quality landing pages, fake news and cloaking — are aimed at being more transparent with publishers about how Facebook works.
“Most of this is targeted at bad actors. It’s important to understand what we’re doing and why so you don’t get caught in the crosshairs,” he said. “This is one step in a longer path that we’re trying to embark on to strengthen our relationship with the industry by being more clear about what works and what doesn’t work on our platform.”