June 18, 2018

For a brief moment, the California Republican Party supported Nazism. At least, that’s what Google said.

That’s because someone vandalized the Wikipedia page for the party on May 31 to list “Nazism” alongside ideologies like “Conservatism,” “Market liberalism” and “Fiscal conservatism.” The mistake was removed from search results, with Google clarifying to Vice News that the search engine had failed to catch the vandalism in the Wikipedia entry.

Google has long drawn upon the online encyclopedia to append basic information to search results. According to the edit log for the California GOP page, someone added “Nazism” to the party’s ideology section around 7:40 UTC on May 31. The edit was removed within a minute, but it appears Google’s algorithm scraped the page just in time to pick up the fake entry.

“Sometimes people vandalize public information sources, like Wikipedia, which can impact the information that appears in search,” a Google spokesperson told Poynter in an email. “We have systems in place that catch vandalism before it impacts search results, but occasionally errors get through, and that's what happened here.”

The Wikimedia Foundation, a nonprofit organization that operates Wikipedia, also issued a statement on Twitter.

According to Google, more than 99.9 percent of Wikipedia edits that show up in Knowledge Panels, which display basic information about searchable keywords at the top of results, aren’t vandalism. The user who authored the original edit to the California GOP’s page did not use a registered account, making them hard to track down.

That’s a common tactic among people who vandalize Wikipedia pages, a practice the nonprofit has documented extensively. But given the volume of edits that are made on Wikipedia — about 10 per second, with 600 new pages per day — and the fact that Facebook and YouTube are now pulling from them to provide more context to posts, the potential for abuse, and its reach, is high.

“Of course it is a pretty weak way to combat fake news because Wikipedia is not a reliable source of information — as even Wikipedia acknowledges,” said Magnus Pharao Hansen, a postdoctoral researcher at the University of Copenhagen, in a message to Poynter. “Wikipedia is very vulnerable to hoaxes and contains all kinds of misinformation, so it is not a very serious way to combat the problem of fabricated news.”

Hansen has been editing Wikipedia pages for about 10 years and said vandalism is commonplace on the platform. The number of editors has dwindled in recent years, while online partisans have increasingly targeted the platform — a problem that came to a head during the Gamergate controversy in 2014. It’s essentially a numbers game.

That’s a far cry from where Wikipedia was in 2005, when a study found it was about as accurate as Britannica. The dwindling editor base also makes it harder for volunteers to combat vandalism at scale on Wikipedia, which doesn’t bode well for using the site to combat misinformation on other platforms.

“Certainly there is a lot of wrong information on Wikipedia, and a lot of it is going to stay,” Hansen said. “Oftentimes someone writes an article and it can be years before someone comes along and edits it. People don’t do fact checks unless something looks like it’s really out of place.”

One fabricated Wikipedia article lived on the site for a decade before being deleted in 2015.

At the same time, the platform has proved resistant to the kind of hoaxes that regularly go viral on Facebook and Twitter. Wikimedia estimates that about 2.5 percent of daily edits are vandalism, and Samantha Lien, a communications manager for Wikimedia, pointed to an algorithm that automatically flags questionable edits as a key success for its integrity efforts.

“Articles tend to provide a balanced representation of the facts, and the site has proven resilient to fake news and misinformation up to this point,” she told Poynter in an email. “We’ve also worked with volunteer editors in the past couple of years to build and improve moderation tools to quickly identify and address vandalism on Wikipedia.”

Beyond Wikipedia’s limitations, the tech platforms’ own systems are frequently the subject of mistakes and controversy — especially Knowledge Panels.

Also on May 31, a search for a Donald Trump-supporting Republican senator in North Carolina surfaced an image of her with “bigot” written across the bottom. In January, The Daily Caller found that Google was wrongly appending fact checks to the outlet’s content and not to that of other publications. That led the tech company to suspend the feature until it worked out the bugs.

Google told Poynter that it doesn’t manually change search results to show preference for one party over another. But its algorithms are regularly gamed by hoaxes, fake ads and trolls seeking to alter search results. And those problems could extend to other platforms linking to Wikipedia.

“When more media platforms use content from Wikipedia uncritically, that does raise the stakes — both for those who would mislead or hoax and for those who would spread knowledge,” Hansen said. “I think, particularly with linking to Wikipedia from YouTube videos about conspiracy theories, that might have the effect of drawing in the wrong crowd.”

In March, YouTube announced that it would link directly to Wikipedia pages in video descriptions in order to provide more context and debunk conspiracy theories, which often go viral after breaking news events. Those “information cues” — which were introduced on the platform without Wikipedia’s knowledge — will include a short line about the source and a link to its Wikipedia page.

That feature hasn’t been rolled out yet, but YouTube introduced a similar one in February that provides more context to news organizations that receive money from the U.S. government.

(Screenshot from YouTube)

Given the increased visibility of Wikipedia pages, it’s conceivable that vandals could flock to the platform in order to troll multiple platforms with one edit.

“Wikipedia 'hacking' has always been a problem, but it was an issue that only really impacted readers of Wikipedia. Now it's even more dangerous as the platforms are increasingly automatically pulling from Wikipedia,” said Claire Wardle, executive director of First Draft, in an email to Poynter. “For agents of disinformation who want maximum amplification, any technique that ensures platforms or newsrooms will repeat and therefore legitimize a falsehood, provides a disproportionate 'return on investment.’”

In spite of serving as a source for more platforms, Wikipedia hasn’t seen an uptick in vandalism, Lien said. But given that those integrations caught the organization off guard, it’s unclear exactly how they’ll affect the integrity of Wikipedia pages.

“The trend of technology companies using Wikipedia to address issues of misinformation on their own platforms is new territory for us, and we don’t know what the full implications will be,” she said. “In the short term, we’re closely monitoring how this affects Wikipedia and our global community of volunteers. We have not seen anything that would indicate a broader scale issue of vandalism to date.”

But even if there were an increase in Wikipedia vandalism with the intention of altering results on the platforms, Lien said the organization would have no way of knowing.

“We do not have data available that would indicate trends in vandalism with the explicit intent to manipulate other platforms,” she said.

So how likely is it that more misinformation will make its way onto other platforms? Joseph Reagle, an associate communications professor at Northeastern University and a Wikipedia expert, told Poynter that while there’s no way to know without the right monitoring systems in place, the structure that’s been set up could make misinformation more visible — even when it’s not created to game platforms like Google and Facebook.

“Yes this is likely, and I think it’s going to become even more likely as these for-profit companies keep jumping on the bandwagon,” he said. “In the Google case, I don’t think it was the intention of the edit to pollute Wikipedia’s downstream users, but that was still the effect. The fact that Wikipedia is used by these downstream aggregators of sorts will also make Wikipedia more of a target, which is a point of concern.”

Still, although its editor base is shrinking, Wikipedia has a committed community of editors with years of experience handling articles about breaking news. Nathan Matias, a postdoctoral research associate at Princeton University, told Poynter in an email that misinformation isn’t likely to frequently slip through the cracks because of that community.

But on the fringes of the site, that chance substantially increases.

“In breaking news cases, where activists of all kinds have developed ways to influence platform algorithms faster than the speed of journalism, I expect that Wikipedia will be more resilient to influence,” he said. “The risks to Wikipedia and to platforms will be greater on the fringes of knowledge and attention, since it's harder to spot a small stream of people accessing low-profile extremist material.”

Despite the challenges, some experts see tech platforms linking to Wikipedia pages as a step forward for countering online misinformation.

Eni Mustafaraj, an assistant professor of computer science at Wellesley College, told Poynter in an email that the addition of Wikipedia to big tech platforms is a positive move. But it comes with caveats.

“They need to do this consistently, and not only to combat misinformation or hoaxes or conspiracy theories,” she said. “In order to combat misinformation, we shouldn't put Facebook and Google in charge, because it was them who exacerbated the problem in the first place.”

At the same time, displaying Wikipedia’s information on other platforms may inherently confuse readers. Hansen said seeing links to the site from social media platforms, even if they’re well-labeled, won’t elicit the same kind of skepticism that is intrinsic in Wikipedia users.

“The reader doesn’t know this is from Wikipedia. That sort of introduces a layer where you can’t be sure about the source of the information,” he said. “People are skeptical when they’re on Wikipedia — you can see when there’s no source … when you’re on a different platform, you’re not alert to that kind of thing.”

Then there’s the question of bandwidth. As editorship continues to decline, Wikipedia might want to consider hiring full-time employees whose job it is to monitor vandalism and ensure it doesn’t spread to other platforms. Hansen said he thinks the nonprofit should create an in-house fact-checking unit or solicit expert editors and academics to weigh in on some of the more nuanced pages.

“I think that is the only way to combat disinformation on Wikipedia,” he said.

At the very least, Lien suggested that tech platforms that use Wikipedia to counter misinformation give back to the nonprofit in some way — a point that Wikimedia Foundation Executive Director Katherine Maher argued in a recent article for Wired. (Facebook and YouTube aren’t included among Wikimedia’s top benefactors.)

“We encourage companies who use Wikipedia’s content to give back in that same spirit of sustainability,” she said. “In doing so, they would join the millions of individuals who chip in to keep Wikipedia strong and thriving.”

And Matias agreed.

“If companies expect to rely on massive amounts of unpaid and often psychologically-burdensome labor from Wikipedians and the Wikimedia Foundation to address some of their toughest problems, they need to find ways to support that work while protecting Wikipedia's intellectual independence,” he said.
