March 7, 2011

Last week, my news organization announced we were evolving our online commenting practices a bit to improve the quality of discourse on NPR.org.

Our comment threads drew some attention recently when a thread about the brutal assault on CBS correspondent Lara Logan in Egypt went awry, prompting the removal of dozens of comments and an editor’s note reiterating the discussion guidelines.

Meanwhile, in another corner of the Web, a related discussion of sexual harassment of women in Egypt unfolded with civility, thoughtfulness, and occasionally even erudition (and also, fair warning, some profanity here and there).

What gives? Why are these two online conversations so different? I spoke with community managers at both sites — NPR’s Eyder Peralta and MetaFilter’s Jessamyn West — to put together a framework for how to think about online comments. What follows is distilled from those conversations.

Five key principles of online conversations

Don’t blame (or credit) “The Internet.” It’s your community that matters. For some reason, while it’s easy for us to understand the dynamics of different places and people in the offline world, we still talk about the online world as one undifferentiated blob. When referring to the offline world, we talk about “bad neighborhoods” and “angry mobs” and “obnoxious loudmouths.”

But every conversation about bad online discussions seems to ascribe the failure to “the Internet” rather than a specific online space, although the particular dynamics of a place matter online as much as they do offline. NPR.org, like many websites, encompasses many different community spaces; some feel friendly and personal, others feel like a muddle of passing strangers.

For better outcomes, use better filters. Clay Shirky famously said, “There is no such thing as information overload, there’s only filter failure.” This is excellent because it frames the problem around its solution. You don’t have to passively accept the condition of drowning in too much information; you have to find and deploy better filters. You can frame the problem of bad online conversations the exact same way — it’s only filter failure.

To understand this, you might have to broaden the scope of what you consider a filter. You’re familiar with a very common example — the spam filter. Having to register to comment on a site is another example of a filter. But lots of filters don’t rely on algorithms or databases — a commenting policy is a filter, as is a human moderator.
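To make the idea concrete, here is a minimal sketch, in Python, of several kinds of filters chained together. All the names and rules below are invented for illustration; the point is only that spam checks, registration requirements, policies, and human moderators all occupy the same slot in the pipeline.

```python
# A minimal sketch of commenting filters as a pipeline. The names
# and rules are invented; real systems differ in the details.

from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    body: str
    author_is_registered: bool = False

def spam_filter(comment: Comment) -> bool:
    """An algorithmic filter: reject comments that look like spam."""
    return "buy now" not in comment.body.lower()

def registration_filter(comment: Comment) -> bool:
    """A barrier-to-entry filter: reject unregistered commenters."""
    return comment.author_is_registered

def policy_filter(comment: Comment) -> bool:
    """A policy filter, standing in for a human moderator's judgment."""
    return len(comment.body.strip()) > 0

FILTERS = [spam_filter, registration_filter, policy_filter]

def accept(comment: Comment) -> bool:
    # A comment is published only if every filter passes.
    return all(f(comment) for f in FILTERS)
```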

The very best filter is an empowered, engaged adult. Whether online or offline, people act out the most when they don’t see anyone in charge. Next time you see dreck being slung in the bowels of a news story comment thread, see if you can detect whether anyone from the news organization is jumping in and setting the tone. As West put it, news organizations typically create a disconnect between the people who provide content and the people who discuss that content. This inhibits quality conversation.

One of the most frequent excuses for our non-participation in the comments is lack of time; there just aren’t enough hours in the day for journalists to step into each comment thread and engage. But as Peralta pointed out, journalists do routinely read threads attached to their stories. Jumping in every now and then to show you’re paying attention is more productive than e-mailing a Web editor after an unmonitored thread goes off the rails.

Learn the difference between conversation and graffiti. Check the comments on a story on your average news site and you’ll find users mostly reacting to the piece. Dip into a thread on MetaFilter, though, and you’ll find users responding, repeatedly, to each other. This quality — when people come to listen to one another’s words, instead of just coming to make themselves heard — is a key signal that you’re in a genuine community. It’s difficult to foster, and news sites are particularly inhospitable to it.

Many commenters treat news stories like bathroom stalls — transitory outlets for anonymous self-expression while no one’s watching. Healthy online communities, on the other hand, can feel very similar to offline communities — your book club, your favorite bar, your office — with familiar faces, in-jokes, and social norms. As Peralta pointed out, it’s easier to cultivate that true community feeling in a blog than it is in news story comments.

The output of a great community is great content. MetaFilter is a very popular website that consists entirely of links, questions and comments posted by users. The ugly phrase “user-generated content” obscures just how delightful that content can be when the community is strong. The upshot of this principle is that some of the best community leaders on the Web treat their crowds much the way editors treat their reporters. Here’s a dissection of how online community impresario Ta-Nehisi Coates assigns, edits, curates and promotes content from his community.

Five key aspects of online commenting environments

The particulars of how different commenting environments work can have a giant effect on the type of discussion that thrives there. If you’re trying to alter the nature of conversation on your site, adjusting one or more of these aspects of the environment can help. Today, many popular providers of commenting engines on the Web (like Disqus or Intense Debate) offer fairly sophisticated tools to control these factors.

Authentication: Do you ask users to create an account to comment, or to provide a name and e-mail address? Can users log in through their Facebook or Twitter accounts? If you offer few barriers to posting a comment, you might get a lot of comments without much quality. But if the barrier to comment is too high, discussion might be anemic.

If you’re finding that your community is growing beyond your means to manage it, one good response might be to set a higher threshold for joining. One threshold some communities have set, for example, is that new users are “auditioned,” requiring a certain number of their comments to be approved by human moderators before they can post automatically to the site. (This is part of NPR’s new commenting strategy, for example.)
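As a rough illustration, here is what that “audition” threshold might look like in code. This is a hedged sketch with invented names and numbers, not NPR’s actual system:

```python
# A sketch of "auditioned" commenting: a new user's comments are
# held for human approval until enough of them have been approved.
# The threshold and names are invented for illustration.

AUDITION_THRESHOLD = 5  # approved comments needed before posting freely

class User:
    def __init__(self, name: str):
        self.name = name
        self.approved_comments = 0  # count of moderator-approved comments

def submit_comment(user: User, body: str) -> str:
    """Publish immediately for trusted users; hold everyone else."""
    if user.approved_comments >= AUDITION_THRESHOLD:
        return "published"
    return "held for moderation"

def moderator_approves(user: User) -> None:
    # Each approval moves the user toward posting automatically.
    user.approved_comments += 1
```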

When MetaFilter began to be flooded with sign-ups from new users, site founder Matt Haughey initially closed off new memberships, then later required a $5 “cover” to become a lifetime member. Yep, that’s right. There’s no paywall to read, but there is a paywall to comment.

But before you go getting any big ideas, there can be a disadvantage to this, too: Users who pay expect a higher degree of customer service. And keep in mind that there was a strong community of users devoted to the site — and frustrated with the behavior of new, unassimilated users — before Haughey took this step, so there wasn’t a big revolt afterward. The community has kept growing at a steady, manageable clip since.

Reputation and scoring: Authentication is closely tied with reputation. If users have persistent accounts that they log into and comment from, and if other users can note that they like (or dislike) particular comments, the community can identify members over time whose contributions are frequently valuable.

Some Web communities allow anonymous comments but keep them hidden by default; some will display an anonymous comment once it receives enough thumbs-up from registered users. Reputation can also be purely time-based — on MetaFilter, new members can’t post to the front page of the site until they’ve been around for at least a week.

On Slashdot, each comment is scored by users, and readers can use those scores to filter which comments are shown in each thread. This means that on high-volume threads, users can hide comments that the community hasn’t found particularly valuable. Comments by established users get a boost to their default score.
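In code, that kind of score-based display might look something like the following sketch; the numbers and names are invented, and Slashdot’s real system is considerably more elaborate.

```python
# A sketch of score-based comment filtering in the spirit of
# Slashdot. The numbers and names here are invented; the real
# system is considerably more elaborate.

class ScoredComment:
    def __init__(self, body: str, author_is_established: bool = False):
        self.body = body
        # Established users get a boost to their default score.
        self.score = 2 if author_is_established else 1

    def vote(self, delta: int) -> None:
        self.score += delta  # readers nudge the score up or down

def visible_comments(thread: list, reader_threshold: int) -> list:
    # Each reader chooses a threshold; on a high-volume thread,
    # raising it hides comments the community hasn't valued.
    return [c for c in thread if c.score >= reader_threshold]
```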

Moderation: West said that the homegrown tools that Haughey has developed for MetaFilter over time all help her to do her job better. She can, for example, leave notes on user pages that only other moderators can see, or give an individual user a time-out so his account is inactive for a few days. But she said that the most useful moderation tools are the most basic ones, such as posting or deleting comments to keep threads on track, or sending messages to users.

MetaFilter also has an incredible space called MetaTalk, which exists purely for users to talk about MetaFilter. If a user has a problem with the actions of another user or a moderator, they can take their issue to MetaTalk instead of derailing the MetaFilter thread.

When Haughey is considering new features or policies, MetaTalk is a place where he can hash out his thoughts with a subset of the site’s most devoted users. I stole this idea when I was creating an arts-and-entertainment community for the Minneapolis Star Tribune, and I highly recommend stealing it yourself. But even in the absence of a MetaTalk-like space, there’s usually nothing preventing you from creating at least a post or a new thread for users to talk about community practices when appropriate.

Policies: Discussion policies are the backbone of effective moderation when they are clearly articulated and consistently enforced. MetaFilter has a very firm policy against users posting links to their own work to the front page of the site, which users are reminded of every time they start a new thread.

Over time, the community has grown an entire wiki elaborating on the basic site guidelines. But the balance between clear, simple guidelines and arcane sets of rules can be very difficult to strike. Lean too far toward the latter and you risk creating a community that becomes more about preserving rules than engaging with one another.

Threading: The format of discussion on a site can have a strong influence on the types of discussions that result. MetaFilter doesn’t feature conversation branches, which means that replies to particular comments aren’t indented or set off from the rest of the conversation in any way. “We like the fact that everyone sort of needs to talk to everyone else,” West has explained.

Increasingly, the convention on the Web is to have branched conversations. This can allow for more discursive discussions that go off in many different directions, but it can also make it daunting to keep track of the flow of conversation. Some commenting environments allow administrators to limit how deeply conversations can branch, restricting users from replying to a response to a response to a comment, for example.
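As a rough sketch, with invented names, such a depth limit is just a check at reply time:

```python
# A sketch of capping reply depth so users can't reply to a
# response to a response to a comment. MAX_DEPTH is invented.

MAX_DEPTH = 2  # 0 = top-level comment, 1 = reply, 2 = reply to a reply

class ThreadedComment:
    def __init__(self, body: str, parent: "ThreadedComment" = None):
        self.body = body
        self.depth = 0 if parent is None else parent.depth + 1
        self.replies = []

def reply(parent: ThreadedComment, body: str) -> ThreadedComment:
    """Attach a reply unless the branch is already at maximum depth."""
    if parent.depth >= MAX_DEPTH:
        raise ValueError("This branch can't be nested any deeper.")
    child = ThreadedComment(body, parent=parent)
    parent.replies.append(child)
    return child
```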

Five tips for fostering great conversations

Learn the ladder of escalation. Most commenting environments give their administrators access to a wide spectrum of possible responses when a commenter is having a bad day. West and Peralta both mentioned that leaving a comment in a thread to cut the tension is a great first step when a discussion begins to go awry. A behind-the-scenes, one-on-one e-mail is a potent-yet-mild corrective when a particular user is causing trouble.

Peralta estimated that a private e-mail fixes the problem 95 percent of the time. If a full-on knife fight is underway in the comments, you might delete some comments and gently set the discussion aright. (Make sure to read David Ardia’s wonderful explanation of why you’re allowed to delete comments without taking on liability for what everyone says on your site.) The last step is blocking users from commenting, or banning them, which West says only typically happens to spammers on MetaFilter.

Practice aikido. Aikido is the martial art of redirection, said West. When someone’s trying to punch you in the face, instead of blocking his fist so he punches your hand, you redirect his fist to your side, using his own force to trip him up. Similarly, great community management often involves disarming folks who’ve got their dander up, defusing heated moments in conversation, and channeling a commenter’s passion into directions more productive than yelling at other users.

West does this brilliantly in her interactions on MetaFilter, as you can see by scanning her history of comments. She conspicuously avoids stern warnings or wrist slaps; her admonitions, by and large, are simply constant reminders that the users on MetaFilter are people, that the site is a community worth valuing and respecting. Please be nice to each other, please act like you like this place, make an effort. West said she gets a lot of apologies from users in her e-mail, even those she hasn’t said a thing to.

You don’t have to prove anything. It is crucial to develop or adopt or at least feign a thick skin if your goal is cultivating a good conversation. This might be the single biggest pitfall for reporters, all of whom have poured care and effort into crafting a terrific piece of journalism, only to scroll to the bottom and see themselves described as a lazy bias idiot who can’t even spell correctly. Every adrenal impulse developed in the long history of human evolution tells us to either disengage or come out swinging in these moments. But both staying out of the comments and jumping into the fray might mean inviting a pile-on.

Fortunately, there’s another way to engage. West narrated it this way: “Let me model some good behavior for you. That person said something really s—-y. I’m just going to ignore it. Watch me.” That’s not a natural reaction for a lot of folks, she said, but it gets results. The word “why” can also be a useful tool, Peralta said. They’re allowed to think you’re stupid, but explanations are always helpful.

Assume good faith. Sometimes it’s difficult to read a racist, sexist, vicious comment and not jump to the conclusion that the commenter is an evil troll. But most of the time, West said, it really is just a bad day or bad time, and they’ve forgotten that there are people behind those usernames and avatars. This Jeff Pearlman column is a terrific reminder of that fact. True, persistent trolls are a tiny minority of the population.

Be accountable. Whether you’re a reporter explaining some of the choices you made in a story, or a community manager explaining the decision to delete some comments, it can be freeing to acknowledge that you’re human and you might not have made the best call. Admin rights let you delete comments, West said, but they don’t necessarily get you respect.

Matt Thompson is an Editorial Product Manager at NPR, where he works with member stations to develop niche websites.
