By now, it’s a common refrain: Facebook insists that it’s not a media company, even though it does exactly what most news organizations do: show its users advertising adjacent to content relevant to their interests.
The “we’re not a media company” claim is meant to obviate Facebook’s responsibility to make the kind of discerning choices about content that human editors make — an impossibility, the company argues, given the sheer volume of photos, videos and stories uploaded by the social network’s 1.8 billion monthly active users.
Despite that claim, Facebook is increasingly running into the messy realities of the publishing business, turbocharged by its enormous reach and scale.
The latest example of this contradiction was on display Sunday after Facebook user Steve Stephens uploaded a video of himself killing a man and boasting that he murdered several other people. Facebook removed and condemned the video, but not before it had spread widely and raised questions about the company’s ability to police its own social network for horrific content.
This isn’t a new problem. In October, the Pulitzer Prize-winning news photograph dubbed “Napalm Girl,” which depicts a naked child fleeing a napalm attack during the Vietnam War, ran afoul of the social network’s standards against nudity. It was pulled down, then reinstated after the editor of the Norwegian newspaper Aftenposten made a personal appeal to Facebook founder Mark Zuckerberg (calling him “the world’s most powerful editor”) and cited the photo’s newsworthiness.
As with “Napalm Girl,” Facebook is treading territory that has been covered by news organizations before. Newsrooms have long debated the value of publishing graphic or troubling images that may shed light on injustice, violence, poverty or squalor. And they’ve also grappled with broadcasting violence. In 1974, Christine Chubbuck, a Florida TV anchor, shot herself live on television. In 1987, R. Budd Dwyer, Pennsylvania’s state treasurer, killed himself during a televised press conference. And in 2015, Adam Ward and Alison Parker, journalists for WDBJ in Virginia, were shot to death on camera as they covered a story outside the newsroom.
As news of the shooting spread Sunday, several journalists and media thinkers drew parallels to traditional media companies and noted that the shooting could imperil Facebook’s plan to bolster its live video capabilities.
Facebook Live worst case has happened. FB has to decide if it wants to continue with it knowing risk to reputation/ public safety
— emily bell (@emilybell) April 16, 2017
because they are accountable to their brands and their audiences and their advertisers. I think that makes them "media companies." https://t.co/9DD3jKNX07
— Jason Kint (@jason_kint) April 17, 2017
Writing in Axios, media trends reporter Sara Fischer said Facebook’s response to the shooting “could set a lot of precedents for everyone in the digital ecosystem”:
Livestreamed and crowdsourced content involve risk for everyone: platforms, publishers, consumers and advertisers. There’s no real regulation around either, forcing everyone to make some tough decisions about how to weigh the risks of an imperfect technology.
It would be difficult for Facebook to prevent a repeat of Sunday’s shooting while preserving the universal access and spontaneity of Facebook Live. But, as with the spread of fake news in the run-up to the U.S. election, the social network has shown a willingness to adjust its platform to forestall malicious content if the clamor gets loud enough.
Correction: WDBJ reporter Alison Parker and video journalist Adam Ward were killed in 2015, not 2014.