Poynter at SXSW: Algorithms, Journalism and Democracy

Editor’s Note: Poynter will be at South by Southwest, the annual music, movie and interactive festival, March 7-16, in Austin, Texas. Look for our Poynter faculty members, Roy Peter Clark, Ellyn Angelotti and Kelly McBride, and digital media reporter Sam Kirkland. Here is the third in a series of posts on what we’ll be doing at SXSW.

Algorithms control the marketplace of ideas. They grant power to certain information as it flies through the digital space and take power away from other information. Algorithms control who sees what on social-media sites such as Facebook and YouTube, through search engines such as Google and Bing, and even in defined news spaces such as The New York Times, with its most-shared and most-commented lists, and Yahoo News.

Just ask some poor guy who’s tried to get his old DUI photo removed from a scurrilous mug-shot site. Having your old mug shot out there in the ether isn’t so bad, except when it turns up on the first page of a Google search for your name. That mug-shot sites were able to make a killing by charging to remove information is a testament to the power of algorithms. That Google and other search engines were able to penalize mug-shot sites (after The New York Times and other news organizations drew attention to the scummy practice) is a testament to the mysterious power of the people who control the algorithms.

This used to be the job of editors, whom we described as gatekeepers. Those editors were flawed human beings, biased by their own perspectives. And it was hard to hold them accountable because their process for making decisions was a private one.

But algorithms are by their very nature biased, meant to give priority to some information and de-emphasize other information. And it’s even harder to determine the biases of an algorithm than it is to determine the biases of a human editor.

If you’re concerned with democracy, you’re in favor of holding algorithms accountable for their impact on the marketplace of ideas.

Nicholas Diakopoulos argues in a paper issued this month for the Tow Center for Digital Journalism that journalists are the natural check on powerful algorithms. His report is aptly titled "Algorithmic Accountability Reporting: On the Investigation of Black Boxes."

How can journalists demystify algorithms? First by observing and describing how certain algorithms are working. Then by questioning the assumptions. And finally by reverse-engineering those algorithms to force more transparency into the system.

Diakopoulos offers a methodology for doing so, which includes isolating the algorithm, testing it with a valid sample, talking to sources, and then revealing newsworthy findings. His process requires a certain base of knowledge and familiarity with how algorithms work. But one need not be a computer programmer to do this work — the report cites several examples of such journalism and describes how the reporters arrived at their conclusions.
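The core of that methodology can be sketched in a few lines of code. In the toy example below, the ranker is a hypothetical stand-in for a platform's opaque feed algorithm (no real platform's code is implied): the reporter builds a controlled sample of inputs that differ in exactly one attribute, observes the outputs, and infers what the black box rewards or penalizes.

```python
# Illustrative sketch of black-box auditing: vary one input attribute at a
# time, observe the outputs, and infer what the algorithm rewards.
# opaque_ranker is a made-up stand-in, not any real platform's algorithm.

def opaque_ranker(post):
    """Stand-in for a black-box feed algorithm we can only observe."""
    score = post["shares"] * 2 + post["comments"]
    if not post["original_content"]:
        score *= 0.5  # the hidden penalty the audit hopes to uncover
    return score

# Controlled sample: two posts identical except for one attribute.
baseline = {"shares": 100, "comments": 50, "original_content": True}
variant = dict(baseline, original_content=False)

# Compare outputs to surface the hidden behavior.
penalty = opaque_ranker(variant) / opaque_ranker(baseline)
print(f"Non-original content scores {penalty:.0%} of the baseline")
# prints "Non-original content scores 50% of the baseline"
```

In practice the auditor cannot read the ranker's source, only query it, so conclusions rest on how representative the sample of test inputs is; that is why Diakopoulos pairs the testing with source interviews before publishing findings.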

This approach closely follows the scientific method, Diakopoulos writes. I would argue that certain communities and audiences could be enlisted to help with the work.

Deciphering algorithms is more than just determining how they work. It's also describing why certain information or information providers are so much better at optimizing for certain algorithms. For instance, Upworthy got really good at playing Facebook's algorithm late last year. Then Facebook changed its algorithm, apparently de-emphasizing Upworthy because it doesn't create original content. As a result, another site, Mental Floss, saw a huge benefit.

Describing what’s happening in algorithms is a critical function of journalism. Why is this type of informed analysis crucial to democracy?

  • It informs citizens and makes them more literate. The more people know about why they organically get certain information and have to hunt for other information, the more they know what to hunt for.
  • It holds the powerful accountable. Most private companies are never going to reveal what values they prioritize. But helping citizens decipher the apparent values gives them the power to pressure companies to be honest brokers.
  • It levels the playing field, sharing information held by a few with the masses.

Gilad Lotan, chief data scientist at Betaworks, and I will team up for a SXSW session exploring Algorithms, Journalism and Democracy on Sunday, March 9, at 6 p.m. ET (5 p.m. CT) in Austin, Texas.

Related: Poynter at SXSW: The ins and outs of Twibel | Poynter at SXSW: Welcome back to the WED dance
