Wikipedia is more than a publication. As I point out in Living in Information, Wikipedia is also the place where this publication is created. At its scale, it couldn’t happen otherwise. But Wikipedia is more than that: it’s also becoming a key part of our society’s information infrastructure. Other systems increasingly rely on it for the “authoritative” versions of particular concepts.
This works well most of the time. But it’s not perfect, and it can lead to weird, unexpected consequences. For example, a Wikipedia entry is part of the reason why Google says I’m dead. More recently, a Wikipedia hack led to Siri showing a photo of a penis whenever a user asked about Donald Trump. While the former example is probably due to bad algorithms on Google’s part, the latter seems to be a failure of Wikipedia’s security mechanisms.
The people who manage Wikipedia are in an interesting situation. Over time, they’ve created a fantastic system that allows organized content to be created efficiently, from the bottom up, at tremendous scale. They’ve been incredibly successful. Alas, with success comes visibility and influence. The more systems depend on Wikipedia’s content, the more of a target it becomes for malicious actors.
Countering these actors will require that the team rethink some of the system’s openness and flexibility in favor of more top-down control. How will this scale? Who will have a say in content decisions? How will Wikipedia’s governance structures evolve? These discussions are playing out right now. Wikipedia is a harbinger of future large-scale generative information environments, so it behooves us all to follow along.