Factually is a newsletter about fact-checking and accountability journalism, from Poynter’s International Fact-Checking Network & the American Press Institute’s Accountability Project. Sign up here.

Why ‘facts won’t save us’

Whitney Phillips, an assistant professor of communication and rhetorical studies at Syracuse University, wrote a piece for Columbia Journalism Review saying that disinformation, like environmental pollution, calls for an ecological solution. “Facts won’t save us,” she wrote.

Phillips is the author of two books as well as an important paper on disinformation amplification for Data & Society. Now she is collaborating with communication scholar Ryan Milner on a forthcoming book, “You Are Here: A Field Guide for Navigating Polluted Information.”

She agreed to answer a few questions for us this week.

1. In recent years, we’ve heard people talk about a “war” on misinformation, even including the suggestion that citizens have a patriotic duty to fight it. You argue that it should be approached from an ecological perspective. What do you think are the similarities/differences in these approaches? 

The book definitely frames these issues in terms of healthy, engaged citizenship, and that strikes me as pretty fundamentally patriotic! Not toward any specific country, but rather toward democracy itself. Something must be done, and must be done soon, about the spread of polluted information. In order for that to happen, we argue, we have to start thinking differently about the problem.

That’s where an ecological framework comes in: it approaches the issue as an outright network crisis, one akin to the climate crisis. In the CJR article and the book, that framing begins with our use of “polluted information” rather than “mis- or disinformation.” By adopting that term (building on Claire Wardle’s framing), we’re first of all sidestepping efforts to distinguish deliberate falsehood from inadvertent falsehood — because it’s often not clear who posts what for what reasons.

What the metaphor of pollution does, along with the other metaphors we use throughout the book (we have chapters focused on redwood root systems, land cultivation and hurricanes), is encourage reflection on how deeply connected we are to each other and to the environment. It also highlights how everyday people contribute to pollution’s spread – even when they’re trying to help, say by retweeting something in order to condemn it, or to make what they think is a harmless joke, or, yes indeedy, to fact-check a false claim.

The idea that we can still pollute even when we’re trying to help can be distressing – but we all play a role in what the internet is like. We therefore can all play a role in how we start cleaning it up.

2. Who is your field guide for — journalists, readers, citizens in general?

The book is for everyone, reflecting our argument about the dense interconnection between networks, platforms, industries, you name it. Just as in an ecological system, everything online is plugged into everything else. Same goes for readers. Obviously a journalist at a large publication will have a wider social media reach than an everyday person with a few hundred followers; obviously that journalist can spread much more information much more quickly than the average person.

But everyday people still contribute to what information spreads across their networks through retweets and clicks and likes. Even if indirectly, those choices ultimately help shape — drumroll — what journalists end up writing about. Conversely, everyday people take significant cues from the things journalists publish. Compounding this relationship, both journalists and everyday citizens are influenced in all kinds of opaque ways by the algorithms that push certain things, but not others, into their eyeballs. Around and around, with algorithms feeding and being fed by everyday citizens, who are feeding and being fed by journalists, who are feeding and being fed by the algorithms.

We do draw one line in the book, however. That’s the line between citizens of good faith, those who – regardless of political affiliation – recognize the problem and want to do something to help strengthen democracy (or at least not make things any more polluted), and citizens of bad faith — those who are actively trying to dismantle civil society, and/or who only care about themselves, their brands or maintaining a stranglehold on their supporters, everybody else be damned. I don’t know what could convince citizens of bad faith to care about our shared world or to see all the ways we are fundamentally interdependent. Luckily, there are so many more citizens of good faith, and so that’s where we’re putting our energy.

3. Getting practical here, what steps can journalists take right away to do their part to help clean up the pollution? 

The most critical advice we’d give journalists about minimizing the pollution they produce is the same advice we’d give anyone else: None of us — not me, not you, not anyone — ever stands outside the world we’re commenting on.

We are all, always, right in the middle of it. Journalists (and everyday people as well) like to think — because it’s easy to think — that their responses to particular stories aren’t part of that story. But how and when and if a journalist (or anyone) chooses to react to something influences the size, speed and trajectory of the storm.

Enter our hurricane metaphor. As we argue, nobody would ever point to a single gust of wind and say, “That’s a hurricane”; instead, hurricanes emerge out of a whole litany of forces, from water temperature to wind speed to the rotation of the Earth on its axis. Similarly, it wouldn’t make sense to point to one element of a story – one presidential tweet, for instance – and declare that that’s the hurricane. Instead it’s a confluence of everything all at once, including the energy afforded by all the people reacting to the tweet or hoax or conspiracy theory or whatever.

Telling that story without highlighting the role a person’s own amplification plays (especially for journalists, but again, also for everyday folks) doesn’t just result in a less complete, and ultimately less true, story. It can push the hurricane to more dangerous places, all because that person doesn’t recognize where and how they fit when they look up, look down and look side to side at the world around them.

. . . technology

  • Facebook is complying with Singapore’s controversial new law aimed at curbing misinformation by adding a disclaimer to stories the government says violate the law. “Facebook is legally required to tell you that the Singapore government says this post has false information,” the disclaimer said, according to The Wall Street Journal.
    • The wording, one human rights advocate told The Washington Post, represents the “legal minimum” and signals that Facebook is not supportive of Singapore’s requirement.
  • And now China also has new rules governing video and audio content online, including a ban on “fake news” created with technologies like artificial intelligence and virtual reality.

. . .  politics

  • Politico laid out four ways in which the challenges of fighting disinformation are evolving in the 2020 election cycle.
    • Among the key points: “The election interference tactics that social media platforms encounter in 2020 will look different from those they’ve been trying to fend off since 2016.”
  • Russian trolls may not have significantly polarized the American public because they mostly interacted with those who were already polarized, according to a new study from eight Duke University researchers. Trolls and bots are “a symptom of our polarized political environment, not the cause of it,” wrote Alex Shephard in The New Republic.

. . .  the future of news

  • Report for America’s plans to add 250 reporting positions in newsrooms across the country will include a PolitiFact fact-checker at the Detroit Free Press.
    • PolitiFact Editor Angie Drobnic Holan said the project will cover messaging in the 2020 presidential election and fact-check local issues in Michigan, as (Poynter-owned) PolitiFact does with partnerships in 13 other states.
  • Last month, we noted a Washington Post editorial saying that President Trump’s attacks on the media as “fake news” are emboldening despots around the world to do the same. Now, The New York Times editorial page has made a similar argument, including an interactive showing exactly where it’s happening.

Brazilian fact-checker Alessandra Monnerat, from Estadão Verifica, caught two popular right-wing websites mixing the results of three different polls to “prove” that President Jair Bolsonaro’s approval rate had gone up.

Monnerat’s work showed that the misleading articles ignored the fact that polls based on different methodologies can’t be compared or combined. It exposed how the infographic used to illustrate the story was misleading, and it alerted Brazilians that the two websites had omitted all of the statistical information available last week about Bolsonaro’s disapproval rates.

According to all three research institutes cited in the piece (Ibope, Datafolha and XP/Ipespe), Bolsonaro’s disapproval rates have actually risen since he became president.

What we liked: After Monnerat’s fact check was published, one of the websites issued a correction. On social media, the misleading post had been shared almost 7,000 times. Reflecting on the impact of her work, Monnerat said on Twitter that the fact check was only possible because she had done three workshops on how to cover electoral polls.

  1. The Partnership on AI and First Draft are launching a research fellowship to investigate tactics for effectively communicating video manipulations to the public.
  2. Tech firms under fire on political ads are trying every response but the right one, The Washington Post wrote in an editorial.
  3. The Indian Press Information Bureau has set up a fact-checking unit to verify news related to the government and is asking people to email snapshots of “dubious material” they come across on any platform.
  4. A Seattle Times writer explained this week how a new University of Washington initiative aims to combat digital counterfeiting and misinformation.
  5. The Nigerian fact-checking organization Dubawa is sponsoring a “Week for Truth” campaign that will include events to engage young students, professionals, entrepreneurs, online content creators and ordinary Nigerians in hands-on activities to “explore the intersections between freedom of expression, civic engagement and fact-checking.”
  6. Brazilian lawmakers invited executives from all of the IFCN’s verified members in Brazil – Agência Lupa, Aos Fatos and Estadão Verifica – and other fact-checking organizations to explain who they are and how they work. At least three sessions have been held so far as part of a parliamentary investigation commission created to discuss the spread of “fake news.” All of the fact-checkers agreed that creating new laws to address mis/disinformation isn’t the best way to handle the problem.
  7. Chequeado, in Argentina, reported that President Mauricio Macri, who will leave office Dec. 10, fulfilled only two out of 20 promises he made during his 2015 campaign. The fact-checking organization is ready to start monitoring the new Argentinian President, Alberto Fernández.
  8. Fact-checkers Clara Jiménez Cruz (Maldita.es, in Spain) and Emmanuel Vincent (Science Feedback, in the United States) have been announced as 2019 Ashoka Fellows. The institution recognizes them as social entrepreneurs and change makers.
  9. Emily Bell wrote a long piece for CJR about how fact-checking has adapted to cover misinformation and become a booming industry over the past several years.
  10. Russia has waged a disinformation campaign aimed at turning Lithuania against the NATO alliance, DefenseOne reported.

That’s it for this week! Feel free to send feedback and suggestions to factually@poynter.org. And if this email was forwarded to you, you can subscribe here.

Daniel, Susan and Cristina
