Bots vs Wikipedians: How Wikipedia Benefits From Bots


Nearly half of all edits to Wikipedia are made by automated software rather than humans, according to a new study. But bot edits are concentrated in the newer language editions and are largely constructive.

The topic is covered in a paper titled “Bots vs. Wikipedians, Anons vs. Logged-Ins” by Thomas Steiner, a postdoctoral researcher and Google employee.

He developed an API that can monitor Wikipedia edits in real time. Unlike some similar studies, he covered all 287 language versions of Wikipedia, some of which contain just a handful of articles.
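
Steiner’s own API predates Wikimedia’s current public streaming infrastructure, but the same kind of real-time measurement is easy to sketch against today’s EventStreams service. The Python below is a minimal illustration of the idea rather than Steiner’s implementation: it subscribes to the public recent-changes stream and tallies bot versus human edits per wiki, relying on the bot flag that each edit event carries (the function name and event cap here are our own choices).

```python
import json
from collections import Counter

import requests  # third-party: pip install requests

# Wikimedia's public EventStreams endpoint for recent changes across all
# wikis. This service is a modern stand-in for the raw recent-changes feed
# that studies like Steiner's originally consumed.
STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchange"


def tally_bot_edits(max_events=10_000):
    """Count bot vs. human edits per wiki from the live edit stream."""
    totals = Counter()  # (wiki, is_bot) -> edit count
    seen = 0
    with requests.get(STREAM_URL, stream=True, timeout=60) as resp:
        for line in resp.iter_lines():
            # Server-sent events: payload lines are prefixed with "data: ".
            if not line or not line.startswith(b"data: "):
                continue
            event = json.loads(line[len(b"data: "):])
            if event.get("type") != "edit":
                continue  # skip log entries, page creations, etc.
            totals[(event["wiki"], bool(event.get("bot")))] += 1
            seen += 1
            if seen >= max_events:
                break
    return totals


if __name__ == "__main__":
    counts = tally_bot_edits(max_events=2_000)
    for wiki in sorted({wiki for wiki, _ in counts}):
        bots, humans = counts[(wiki, True)], counts[(wiki, False)]
        share = 100 * bots / (bots + humans)
        print(f"{wiki}: {bots + humans} edits, {share:.0f}% by bots")
```

Left running for a day or two, a tally like this is essentially the study described below in miniature.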

The study covered a total of 3.8 million edits over a two-day period last November. During that time, 260 of those versions saw at least one edit.

Steiner ran the study only to prove that the API itself would work, but he made the API publicly available. Newsweek has since used it to run a four-day study.

It found that 46 percent of all edits were made by bots. However, while virtually every edit on the Vietnamese edition was made by a bot, only five percent of edits on the English-language edition were the work of our automated friends.

The main reason for the difference is that automated edits are most useful on newer editions of Wikipedia that are still missing what might be considered core information. Newsweek notes that bots can take care of tedious tasks such as using census data to flesh out entries about geographic areas, or filling in details about asteroids from NASA data.

With more established editions, bots are more likely to serve as an anti-vandalism tool, though they also play a role in automatically fixing common spelling errors and typos. According to R. Stuart Geiger, a computer science graduate quoted by Newsweek, such bots can’t eradicate vandalism, but they can keep it below the tipping point at which human editors would decide it simply wasn’t worthwhile to keep working on the site.