April 3, 2017

Journalism has a trust problem.

In the run-up to the 2016 election, a Gallup poll revealed Americans’ trust in the press had sunk to an all-time low, with just 32 percent surveyed saying they felt “a great deal” or “a fair amount” of trust in the mainstream media.

Whether it’s criticism from the Trump administration, the failure of the media to correctly forecast the outcome of the 2016 election or the rise of ideological filter bubbles on social media, there’s a growing rift between news organizations and the consumers they exist to serve.

That’s why a coalition of funders, academics, tech leaders, academic institutions and nonprofits, including Facebook, The Knight Foundation, Mozilla and Craigslist founder Craig Newmark, is banding together to figure out solutions.

Earlier today, they announced the News Integrity Initiative, a $14 million project that aims to combat declining trust in the news media and advance news literacy. (Disclosure: The Knight Foundation funds Poynter’s coverage of innovation in local news, Newmark funded the creation of a chair in journalism ethics at Poynter and Poynter has partnered with Facebook to create a journalism certificate.)


The program is being administered by the Tow-Knight Center for Entrepreneurial Journalism at the CUNY Graduate School of Journalism in New York. Before the program was announced, we caught up with Jeff Jarvis, the director of the Tow-Knight Center, for a conversation about the scope of the initiative and what it would take to restore the public’s trust in journalism. Our interview has been edited and condensed for clarity.

When I read this announcement, my first reaction was: Facebook seems to be serious about encouraging news literacy.

Yes, absolutely. Facebook and the other funders. But I would suggest that this is news literacy and more. And even news literacy should be defined broadly.

The problem that I have with the term “news literacy” is it’s a little paternalistic. It says, “If you read our news, you’re literate. If you don’t, you’re not.” I think the public is so much more involved in the process of news creation and distribution now that we have to rethink it as it’s happening. I think we have to make journalists more public-literate, too.

So yes, it’s about news literacy. But it’s also about the informed conversation, which is the way I look at it: How do we do a better job of informing the conversation that exists in a democracy?

We saw the need to convene people. Not compete with them, but to help them share best practices and needs and not duplicate. We saw the need to bring others into a conversation that’s been pretty much media-centric. So we want to make this a little more public-centric.

And we also saw other centers of media that need to be brought in. Last Thursday, we had a small summit here under the Chatham House Rule, bringing in ad agencies, brands, ad networks, ad technology companies, VR companies, the platforms, publishers and so on. And we made some progress there.

Facebook could be accused of being one of the organizations that has contributed to an erosion of the economic model that has supported trustworthy journalism. I don’t believe that news organizations were entitled to the revenue that companies like Facebook scooped up. But one could make the argument that this is too little too late now that they’ve totally trashed the business model for journalism.

You’re speaking to the other half of my life. This initiative is not specifically focused on business, but the rest of my life at Tow-Knight is focused on business. So I spend most of my time worrying about just that. And what I tell my own colleagues in media is that I don’t think Facebook is a media company, or that Google is. I know that’s a minority opinion. I think we tend to look at something that looks like what we do and say, “that’s us.”

We risk defining them in our terms and not understanding what they truly are. They’re connection machines. They connect people with each other and people with information, and there’s a lot of opportunity for us there that we have to understand. What I’ve been arguing — and I’ve written this all over creation, over the last two years — what I want to see from the platforms is help in reinventing the relationship and the business model of journalism for the reality that we have now.

I think we in journalism can do a lot to teach the platforms about journalism and public responsibility. I think they can do a lot to teach us about resetting our relationship with the public we serve and how we can better inform the public conversation together. Because that conversation isn’t happening solely on our site anymore. It’s happening all across the net.

It’s not as if God gave us the revenue that we had. New competitors came along and offered advertisers better deals. But I see the platforms — both of them, Google and Facebook — trying now to help journalistic enterprises. Instant Articles and the new YouTube white label player and so on. I think we see efforts to help the business of journalism.

I hope they can help us strategically. It’s not about giving money to news organizations. It’s about giving strategic wisdom, innovative thinking, data and other capabilities to us to help strategically improve our businesses. That’s what I want to put myself at the center of.

Can you talk about the way you think about news literacy? You said earlier that this is not the “news literacy initiative,” it’s the News Integrity Initiative. What’s the difference?

I went to a great event that Arizona State held a few weeks ago, organized by my good friend Dan Gillmor. And he’s been arguing, eloquently, that we need to think not just about the supply side, but the demand side.

And so that’s another way to say what I’m trying to say, which is we have to think about the public conversation. And there are good efforts in schools…around news literacy, and I think those are important efforts to give people the mechanisms to make more critical judgments about the veracity of what they read in the world now.

But I also think there’s a lot more we have to do to regain trust in news. And it goes far beyond saying that the news that we have now is the news we’ll always have. It goes to say that we can reinvent what we do — not because we have to — but because we can.

We have new opportunities. I want to see us take journalism to people where they are. That means we have to take journalism to conversations as they’re occurring on Facebook and Instagram and YouTube and wherever that might be.

And that means we’ve got to get out of our mindset of making destination products. So I guess all I’m trying to say about news literacy is that what I saw at ASU was a broader discussion, thank goodness, than just getting more people to read what we already produce. What I saw at MisinfoCon, the week before that, was a broader discussion about reinventing news technology to reinvent what we do. I also hope that we have a discussion that’s not just about the bad stuff, but also about what I hope will be a flight to quality.

You mentioned the idea of bringing journalism to places where conversations are already happening. What would that actually look like?

Wherever news and information are, and wherever conversations about society are occurring, that’s what we have to worry about.

Every year I go to a conference called VidCon, with my daughter’s help. What VidCon taught me — and I wrote about this at the time — is that media is not a destination, and that what we should be making are social tokens that people use in the course of their conversations. When my daughter shares something, she’s not doing it because, “this is a good product, you should watch!”

She’s sharing it because she’s saying, “this speaks for me,” or, “this speaks to our conversation.” It’s part of her conversation. We have to start remaking news to be that. So when we talk about news literacy, it’s not just about how you use a product called news. It’s how we change the service that we have called news to bring it more into conversations and make people’s conversations more information-literate.

That’s a much bigger task, but it’s the task at hand.

Putting up funding is a great step from Facebook. But I think the big step Facebook could take would be something like adding additional context for news articles or saying, “We’re going to prioritize quality content over cheap listicles.” It seems like the real commitment from these companies would be to change their platforms to increase the trustworthiness of the news on them.

Yes. But I think the mechanisms, in great measure, are already there. After the Orlando tragedy, the mass murder there, I saw a photo on my Facebook feed of Justin Trudeau kissing a male politician. My journalistic antennae went up. And I said, “Why haven’t I seen this before? This seems like the kind of thing he might do, because he’s Justin Trudeau making a point.”

The person who was sharing this wasn’t exactly on top of things, so my suspicion was right. My critical faculties came out. I moused over it, debating whether to go down the rat hole and try to figure out where this came from.

And boom, as is Facebook’s wont — this is now over a year ago — they popped up related content: BuzzFeed saying people were passing around a fake picture of Trudeau, and Snopes debunking it. The mechanism is there, I think, to do exactly what you’re saying, which is to give people more and better choices at these moments.

I think there’s a role for media in that, and I think we’ve got to have the mechanism to alert Facebook, to put metadata on our news to say: “We just reported that thing. And every time that picture comes up, you might want to give people the opportunity to know this and do it in a shorter way.”

Keep in mind, under the First Amendment, we have a right to lie, too. You can choose to lie. You may be passing around something stupid to mock it. You may share it because it’s an indicator of how another side thinks. I want to be very, very careful here that I don’t want to put Facebook or Google in the position of being the editor and censor of the world. However, I do agree with you that they can each give higher-quality choices to people.

Well, then it’s up to us to help them define quality. That doesn’t mean it eliminates the speech that now can occur. I’m in favor of free speech, and I think that’s a good thing. It does, however, say that we can help them with more quality information.

I’ll give you another example, with Google: If you searched some time ago for “climate change,” you got good results. If you searched for “is climate change real?” you got less good results. That’s no longer the case.

And that indicates to me that Google has chosen to favor the institution of science. I don’t know that that’s true, and I haven’t heard that from them, but I saw an improvement in the quality of those search results. They were less about anticipating the desire of the user and instead about saying that people are asking a question about a topic, and we have to give them a good answer.

And so there’s a bias in that case toward the institution of science. I think that’s a fine thing. But this is at a time in which we have an institutional revolution going on. So, it’s a complex question.

I think the other thing they could do — John Borthwick and I wrote a piece in the midst of the fake news brouhaha with 15 suggestions of what to do. One of John’s big points was that we need the platforms’ help, as users, to track back to the source of information, news, rumors and memes.

So if you see this link from The Denver Guardian and it’s only two weeks old, that’s an indicator, a signal, that you might want to be suspicious. At the same time, I think we also need to surface media brands more prominently. If you go on Facebook and look at The Guardian, every story they put up has a photo and a blue Guardian bar on it. So you have clearer branding there, and you get to know what the source is. I think the platforms can do a better job of revealing source — in both senses of the word.

Have you already explored what some potential projects in this space would be? Can you give me a sample?

I’ll mention a couple of things that I’m thinking about just as an early way to get going. When we held the event last week with the platforms, publishers and technology folks, we talked about the research that they want.

Folks from Microsoft Research were very eloquent about saying that we can’t really come up with solutions before we understand the problems. The source, the supply, the distribution and the absorption of news — they argue we need to do some more research on that, which I think is a very good idea.

I was at another event not long before that when there was debate about the definition of being informed. How are we going to judge our success or failure? I think there are research projects like that that are foundational to get going.

Even before that, however, as my colleague Carrie Brown here at CUNY says — because she has the Ph.D. and I don’t — there is a lot of research around some of these areas. So the first thing I want to do is to hire somebody to go through the research that exists, find those things that are relevant and translate it into layman’s language for the likes of me.

So I think there’s a foundation of research that I want to get going very quickly. Then, at the event we had last week, there was a fascinating discussion — and I haven’t written about this yet — a very tangible discussion to say that there is an opportunity, and a need, to gather, generate and share signals about content and creators across platforms, ad networks, publishers, ad technology companies, and so on.

There’s no one answer to what’s quality and what’s not. However, there are signals. Like, “Jeez, The Denver Guardian’s only been around for two weeks. You might want to look at this suspiciously.”

So what we’re looking at already is a structure to try to pull together these signals and figure out a way that we can share them so that platforms and ad networks can use their own algorithms, use their own formulae, to weight these signals their own way to determine where they want to put their effort. That, in turn, I think, trickles down to the users themselves.

Because the more signals we have in this way, the more it can inform ranking decisions on places like Google and Facebook. So, I think that’s a project that I want to get going on quickly. We want to do some further convenings internationally. I think the days of the fake news conference, of ‘woe is us,’ are over. Now we have to move on to solutions.

So you might try to create a cross-platform metric for newsworthiness?

I want to be careful here. Not a metric. I think there’s no God metric of trust. (But there are indicators.) So for example, does this publication adhere to an editorial policy? Does this publication do fact-checking? Sally Lehrman, director of the Trust Project, has eight things that she’s trying to build. So those are signals that you could give to ad networks, platforms and so on.

That’s one set of signals. Add to that things that the platforms know: How old is this site? That’s a signal. Add to it signals about being a trusted media brand. What I’m suggesting here is that there are a lot of signals you can use.

And then, an ad network can come along with a given client and say, “we’re going to weight that one signal less, and that signal more” — that’s up to them. The question is, can we provide these signals to them? And can those who have data already — ad networks, platforms and such — share data?

Can we become a middleman here to enable the sharing of that data? Can we make everybody who is pointing dollars and traffic to sites smarter about what they do? I don’t know the answer to this. It’s something we want to start exploring.
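To make the “weight these signals their own way” idea concrete, here is a minimal, purely hypothetical sketch in Python. The signal names, values, weights and scoring formula are all invented for illustration; nothing here reflects the initiative’s actual design or any platform’s real system.

```python
# Hypothetical illustration only: how one consumer of shared trust signals
# (say, an ad network) might weight them with its own formula.

def trust_score(signals: dict, weights: dict) -> float:
    """Weighted sum over whichever signals this consumer chooses to use."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

# Invented signals a shared registry might expose for one publisher,
# each normalized to the range 0-1.
signals = {
    "has_editorial_policy": 0.0,      # no published editorial policy
    "does_fact_checking": 0.0,        # no visible fact-checking process
    "domain_age": min(2 / 52, 1.0),   # two weeks old, capped at one year
    "recognized_media_brand": 0.0,    # not an established brand
}

# Each platform or ad network picks its own weights.
weights = {
    "has_editorial_policy": 0.3,
    "does_fact_checking": 0.3,
    "domain_age": 0.2,
    "recognized_media_brand": 0.2,
}

print(round(trust_score(signals, weights), 3))  # ~0.008: treat with suspicion
```

A different consumer could plug the same shared signals into different weights, which is the point: the signals are shared, the judgment is not.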

Benjamin Mullin
