By: Jay Small (http://smallinitiatives.com/)
December 21, 2005

Chasers should check out a funky-but-civil little debate between Chris “Long Tail” Anderson and Nicholas “Does IT Matter” Carr (Caution: it’s not light reading).

Anderson’s premise: People have a hard time trusting information from Wikipedia, Google or a blog “because these systems operate on the alien logic of probabilistic statistics, which sacrifices perfection at the microscale for optimization at the macroscale.” He writes:


When professionals — editors, academics, journalists — are running the show, we at least know that it’s someone’s job to look out for such things as accuracy. But now we’re depending more and more on systems where nobody’s in charge; the intelligence is simply emergent. These probabilistic systems aren’t perfect, but they are statistically optimized to excel over time and large numbers. They’re designed to scale, and to improve with size. And a little slop at the microscale is the price of such efficiency at the macroscale.
To put this another way, any given entry in Wikipedia might be wrong because it is not as carefully vetted as one in, say, Encyclopedia Britannica. But Wikipedia contains geometrically more information on many more subjects than Britannica. So that improves the probability that you’ll find something right on more subjects.

But does that make Wikipedia a better service? Carr’s not so sure:

Might not this statistical optimization of “value” at the macroscale be a recipe for mediocrity at the microscale — the scale, it’s worth remembering, that defines our own individual lives and the culture that surrounds us? By providing a free, easily and universally accessible information source at an average quality level of 5, will Wikipedia slowly erode the economic incentives to produce an alternative source with a quality level of 9 or 8 or 7? Will blogging do the same for the dissemination of news? Does Google-surfing, in the end, make us smarter or dumber, broader or narrower? Can we really put our trust in an alien logic’s ability to create a world to our liking? Do we want to be optimized?
Some thoughts to add to the mix:


  • The other big variable to consider in probabilistic information systems is credibility — of both sources and methods. With Wikipedia, for example, information sources are mostly anonymous to everyday users. One must trust that the community around Wikipedia will eventually catch and correct inaccurate or dated information, but one rarely knows who put it there in the first place. With Google, one must decide whether the method of delivering search results worked as expected.

  • A Google, therefore, has a lower barrier to human trust than a Wikipedia because it’s pretty easy to tell from a few searches whether the Google system has what you want. Wikipedia may have an entry on the subject you want, but it is not nearly as easy to determine whether that entry is accurate. Google is an algorithmically sorted database at its core; Wikipedia is a collection of human effort. It’s harder to trust an anonymous information source than an algorithmic search engine.

  • Most blogs remove anonymity from the equation, but Anderson refers, we think, to the universe of blogs as a probabilistic system, not to any one blog in particular. Still, a blogger can build credibility and trust over time, and many bloggers aim to be opinion leaders more than portrayers of fact.

  • If Anderson is right, and everyday humans can’t grok probabilistic systems, maybe that opens up a lot of new opportunities for information systems designed to work the way humans expect and can trust. Maybe that means narrowing focus to improve accuracy and build credibility and, in the process, zeroing in on information value propositions of manageable size.

People trust a calculator because its results are predictable. People trust other people because they know them. The latter seems to be the kind of trust relationship that local media want to have with their consumers, contributors and customers. Chasers who want that may be best served by focusing on finite, reliable information sets, rather than trying to grow their own large, probabilistic systems.
