Wikipedia expands its ranks of editors with vandalism-sniffing AI

Acts of Wikipedia “vandalism,” promoted in the past by Stephen Colbert for comedic effect, are no laughing matter to the website, and a new automated tool should make it easier to sniff out false and damaging edits to content.

Hailed as a fairly reliable resource, Wikipedia allows anyone to make changes to its encyclopedia-style content. According to a post on the Wikimedia Foundation’s official blog, the site is inundated with around 500,000 changes to articles every day.

Now, software called the Objective Revision Evaluation Service (ORES) will automatically scan user edits for telltale language that suggests malicious intent. The Wikimedia post said the software will make it easier for both its employees and ordinary users to spot potentially damaging edits.

“This allows editors to triage them from the torrent of new edits and review them with increased scrutiny,” the post said.

The foundation said it has been assessing the system for a few months, and over a dozen editing tools and services are already using it.

“We’re beating the state of the art in the accuracy of our predictions,” the post said. “The service is online right now and it is ready for your experimentation.”
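Since the post invites experimentation, here is a minimal sketch of how a script might query ORES for a “damaging” score on a single revision and flag it for review. The endpoint path, model name, threshold, and JSON response shape are assumptions based on the service’s public interface, not details from this article.

```python
# Sketch of querying the ORES scoring API for a "damaging" prediction.
# Endpoint path, model name, and JSON shape are assumptions drawn from
# ORES's public interface, not from this article.
from urllib.parse import urlencode

ORES_HOST = "https://ores.wikimedia.org"

def score_url(wiki: str, rev_id: int, model: str = "damaging") -> str:
    """Build the v3 scores URL for one revision and one model."""
    query = urlencode({"models": model, "revids": rev_id})
    return f"{ORES_HOST}/v3/scores/{wiki}/?{query}"

def is_probably_damaging(response: dict, wiki: str, rev_id: int,
                         model: str = "damaging",
                         threshold: float = 0.8) -> bool:
    """Read the model's P(damaging) from a decoded JSON response and
    flag the edit for human review when it crosses the threshold."""
    score = response[wiki]["scores"][str(rev_id)][model]["score"]
    return score["probability"]["true"] >= threshold

# Illustrative response in the shape the API is assumed to return;
# in practice this JSON would come from an HTTP GET on score_url().
sample = {
    "enwiki": {
        "scores": {
            "12345": {
                "damaging": {
                    "score": {
                        "prediction": True,
                        "probability": {"false": 0.07, "true": 0.93},
                    }
                }
            }
        }
    }
}

print(score_url("enwiki", 12345))
print(is_probably_damaging(sample, "enwiki", 12345))
```

A review tool built on this would surface high-scoring edits to human editors rather than reject them automatically, which is the design point the foundation emphasizes below.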

Not Wikipedia’s first rodeo

This isn’t the first AI tool launched to improve the quality of Wikipedia, but past efforts have hit significant snags. For example, some tools made it more arduous for new users to submit content or make edits.

“These tools encourage the rejection of all new editors’ changes as though they were made in bad faith,” Wikimedia said, referring to past efforts, “and that type of response is hard on people trying to get involved in the movement. Our research shows that the retention rate of good-faith new editors took a nosedive when these quality control tools were introduced to Wikipedia.”

The foundation said ORES tries to avoid this problem by looking solely at the language used.

“The thing to note is it doesn’t judge whether the facts that people are adding are actually true, because fact-checking is immensely difficult, it’s looking at the quality,” John Carroll, a computational linguist at the University of Sussex, told the BBC. “It should help a great deal with Wikipedia.”
