A few weeks after the publication of a comprehensive study on how misinformation spread on Twitter during the French election, more research emerged from the United States.
On Thursday, the Knight Foundation released a study analyzing the role of Twitter in spreading links to fake news stories and conspiracies during and after the 2016 United States presidential election. The report, which was commissioned by Knight and produced by George Washington University and Graphika, a firm that maps social media interactions, looked at more than 10 million tweets from about 700,000 accounts linking to more than 600 outlets that publish misinformation.
What it found during the campaign isn’t too surprising: 6.6 million tweets linked to fake news or conspiracy theories in the month before the election. But after the election — when Twitter took a few actions to counter the spread of falsehoods — misinformation continued to spread, albeit in lower volumes.
According to the report, about 4 million tweets linked to fake news or conspiracy sites between March and April 2017. And many of those tweets came from the same accounts that were active during the election; the Knight report found that 80 percent of the accounts it identified were still active and publishing more than 1 million tweets per day as of publication, including 90 of the 100 most-active accounts.
“Right now, the discussion about misinformation online is based on anxiety and conventional wisdom. That’s not good enough,” said Sam Gill, vice president for communities and impact at Knight, in a press release. “What we need is hard research on the complexity and scale of the issue. This report is one step in that process.”
The study also found that just a handful of sources published the majority of misinforming content. Sixty-five percent of the links researchers identified came from the same 10 sites — a trend that stayed more or less stable after the election. And most of the accounts analyzed showed evidence of benefiting from bots.
To decide which outlets qualify as fake news, which the report defines as “content that has the appearance of credible news stories, but without going through the process of verification that makes real news valuable,” researchers drew from an open-source list of sites from OpenSources labeled as either “fake” or “conspiracy.” They then compared that list to other publicly available repositories.
“A site listed as fake or conspiracy news in the OpenSources database is nearly always categorized that way in other public lists,” the report reads. “Moreover, when comparing various lists of sites judged to be fake news by reputable organizations, there was little disagreement on the sites that multiple entities had investigated.”
The study’s findings come with some caveats. First, mainstream news organizations still publish far more Twitter links than misinforming sources. Second, the report did not analyze Twitter interactions, so it doesn’t take into account the post-level reach of accounts — just their distribution and following. Third, some organizations on the OpenSources list are more appropriately categorized as hyperpartisan sites rather than fake news sites.
While the Knight findings are similar to the French report on Twitter misinformation, there are some notable distinctions. The French study, which looked at 60 million exchanges from more than 2.4 million users, found that less than 0.01 percent of the posts analyzed linked to fake news sites identified by Le Monde’s Décodex database. That’s because, unlike the Knight study, it compared the proportion of misinforming tweets to all tweets sent during the sample period.
However, both seem to confirm that partisanship plays a key role in the dissemination of misinformation, Adrien Sénécat said — and it comes from all political sides.
“Beyond what can be called ‘false information,’ the core of the problem would be bias, polarization and misinformation,” the Les Décodeurs fact-checker, who covered the French report, told Poynter in an email. “I also think it is important to underline that both camps (Republicans and Democrats) spread misinformation and contributed to the climate of hyperpolarization during the election, which lowers the quality of the public debate.”
Giovanni Luca Ciampaglia, an assistant professor of computer science at the University of South Florida who has studied misinformation networks on Twitter, told Poynter in an email that — beyond the report’s findings — it reinforces the importance of more collaboration between researchers and the platforms.
“These are compelling findings that contribute to the growing body of evidence about manipulation of social media,” he said. “It is especially welcome that these results come from external researchers. This reinforces the case for more collaboration between academic researchers and platforms, in the interest of a more trustworthy and reliable information ecosystem.”