April 27, 2018

On Thursday, the European Union published the culmination of months of work to come up with solutions to misinformation. And while the document is limited, experts say it’s a good start.

“It was a very difficult document to come up with because the group was made up of very different agents with different interests — some of them conflicted,” Clara Jiménez Cruz told Poynter on Tuesday before the communication’s release. “The document came out kind of wary, but still it’s a great achievement.”

Cruz, a journalist at the fact-checking TV show El Objetivo and the co-founder of debunking site Maldito Bulo, was one of the members of the EU’s high-level group on misinformation, which was announced in November to help decide what should be done about fake news in Europe. A Eurobarometer survey found that 80 percent of people come across false or misleading information several times a month.

Between January and March, the group — composed of journalists, researchers and technology companies — met several times in Brussels to come up with a final report for the EU commission. (Disclosure: The International Fact-Checking Network was represented in the group.) That report’s key points included enhancing the transparency of online news, promoting media literacy, developing tools to help users and journalists tackle disinformation and promoting research on the impact of disinformation in Europe.

And those recommendations — as well as 2,986 replies to a public consultation — are reflected in the Commission’s final communication, which first lays out the scope of the problem and then dives into specific action areas. These are:

  1. Calling on online platforms to act swiftly and effectively to protect users from disinformation
  2. Strengthening fact-checking, collective knowledge, and monitoring capacity of disinformation
  3. Fostering online accountability
  4. Harnessing new technologies
  5. Securing election processes
  6. Fostering education and media literacy
  7. Supporting quality journalism
  8. Countering internal and external disinformation threats through strategic communication

No regulation — yet

A key point in the communication, which carries less weight than binding legislation, is that it does not call for any hard regulatory action against misinformation.

Rasmus Kleis Nielsen, director of research at the Reuters Institute for the Study of Journalism, said that restraint, along with the fact that the communication was agreed upon by a variety of different stakeholders, is important.

“Clearly anchoring the communication in the primary importance of freedom of the press and emphasizing the need for a collaborative, multi-stakeholder approach to disinformation problems is the most likely to be expected and the least likely to have unexpected consequences,” he told Poynter before the communication’s publication. “I think that’s a really important value.”

In the communication, the Commission draws a distinction between illegal content, such as hate speech in many European countries, and potentially harmful content that should not be regulated. That’s a somewhat advanced view of the problem, as many countries around the world have conflated the two in draft bills and laws.

“Legal content, albeit allegedly harmful content, is generally protected by freedom of expression and needs to be addressed differently than illegal content, where removal of the content itself may be justified,” the communication reads.

Calling on platforms to do more

The communication also implores tech platforms like Facebook, Google and Twitter — all of which were represented in the high-level group — to do more in the fight against online misinformation. It supports the creation of a code of practice, by which tech companies are committed to some of the following things:

  1. Improving scrutiny of advertisements in order to reduce revenues for purveyors of disinformation
  2. Being more transparent about sponsored content by creating repositories where information such as the sponsor identity, money spent and targeting criteria used is provided
  3. Intensifying efforts to close fake accounts
  4. Including indicators of trustworthiness based on objective criteria endorsed by media organizations
  5. Diluting the visibility of disinformation by improving the findability of trustworthy content
  6. Establishing clear marking systems and rules for bots and ensuring their activities cannot be confused with human interactions
  7. Providing fact-checkers and academics with access to data

“These platforms have so far failed to act proportionately, falling short of the challenge posed by disinformation and the manipulative use of platforms' infrastructures,” the report reads. “Some have taken limited initiatives to redress the spread of online disinformation, but only in a small number of countries and leaving out many users.”

Implicit in the commission’s language is Facebook’s fact-checking program, which enables fact-checkers to detect and debunk hoaxes on the platform — decreasing their reach in the News Feed, sending a notification to users who shared them and appending related fact checks. The tool has expanded rapidly over the past month and is now available in 11 countries. (Disclosure: Being a signatory of the IFCN’s code of principles is a necessary condition for participation.)

Additionally, the commission is planning a multi-stakeholder forum on misinformation to be held in the coming weeks. The goal is to get media organizations, tech platforms and civil society groups to agree on an EU-wide code of practice on misinformation, to be published in July. If that doesn’t have measurable effects by October, the commission will consider regulations.

That move was a compromise for journalists, Cruz said.

“The media organizations wanted to make hard legislation for the platforms … We feel that we are very much legislated in the European Union — we have legislation that affects the information we publish,” she said. “If I’m honest with you we have to see how the European forums go. I think everything is going to depend on that.”

Elevating the work of fact-checkers

The communication also pays homage to fact-checkers, saying they are “an integral element in the media value chain.” It points to standards like the IFCN’s code of principles as one of the key ways fact-checkers can maintain credibility in the EU.

To that end, the commission announced the creation of an independent network of fact-checkers to develop working methods, establish best practices and achieve broad coverage by collaborating on fact-checking projects. The commission will provide the network with the online tools needed for such an initiative.

The recommendation also stressed the importance of gaining reliable access to data for fact checks. As such, the commission also announced the creation of a “secure European online platform on disinformation” with cross-border data analysis tools and access to open data in all member states, such as statistics.

Still, Nielsen said they could have done more.

“I think it’s a bit of a missed opportunity that the communication does not explicitly call on public authorities at all levels to share data more promptly and efficiently, especially with fact-checkers,” he said.

Additional considerations

Beyond explicit recommendations for platforms and fact-checkers, the commission’s communication makes a few other points worth mentioning.

First, the report recommends abandoning the term “fake news” in favor of a more precise taxonomy, such as the one used in Claire Wardle and Hossein Derakhshan’s report for the Council of Europe.

Second, the Commission will use the Horizon 2020 research and innovation program to explore investing in new technologies, like artificial intelligence, against disinformation.

Third, it’s holding a high-level conference in late 2018 with member states on how to tackle cyberattacks on elections.

Finally, the commission will aim to improve media literacy by encouraging fact-checkers and journalists to publish materials for schools and include disinformation-specific components in existing campaigns.

Paradoxically, the communication also provides for more cooperation between the commission and the European External Action Service to use strategic communication to counter disinformation. The EEAS’ EUvsDisinfo project, which catalogs alleged instances of disinformation, has come under fire from media organizations for mislabeling their content as disinformation.

That project explicitly targets Russian disinformation, so how does the commission’s continued support for it factor into the communication’s efficacy? It’s complicated.

“Europe is a very big place with very different sensitivities,” Cruz said. “The eastern countries in Europe do have an issue with Russian disinformation that I don’t feel exists so much in Italy or Spain or Greece or Portugal … I understand that they have a lot of pressure from eastern European countries.”

So what’s the likelihood that the communication objectives will be achieved? Cruz said it depends in large part on EU elections next year. And as for the platforms, she thinks they could do even more than what the commission is asking.

“I think they need to make a lot more efforts toward addressing the problem in every country and not just globally,” she said. “What they’ve been doing, to me, seems much more like a marketing position than a real interest in solving the problem.”

Nielsen said whatever ends up happening, the EU and other government organizations should think before jumping to regulate against misinformation — it could cause more harm than good.

“I think we need to be cautious before we assume that only regulatory or political action counts as action. Political institutions are not that effective at it — I haven’t seen evidence for that,” he said. “It’s also clear from a long tradition of thinking in this space that unintended consequences and harm are very significant.”

Daniel Funke is a staff writer covering online misinformation for PolitiFact. He previously reported for Poynter as a fact-checking reporter and a Google News Lab…