Executive summary: MediaWiki software can be a significant and overlooked vulnerability on the web.
There has been remarkably little reporting on the effects and influence of MediaWikis on the web. I have detailed an eight-year case study showing how various troll farms exploit these vulnerabilities, in many cases influencing broader web culture and the mainstream.
Much of my early research into wiki wars on Wikipedia began right before “GamerGate” and continued for seven years afterward, as I descended down a rabbit hole of dark MediaWiki communities spanning multiple Wikipedia-style encyclopedias that were leveraged to harass and assault people on the internet.
Gamergate itself was a cultural tipping point, but as a battle that played out significantly on Wikipedia, it initiated a cultural war against “mainstream media sources” that continued to grow as it amplified into the alt-right and the “dark enlightenment”.
Online culture has not been the same since.
While much could be said about placing blame on a host of toxic factors and personalities, the primary culprit in most of this was simply the MediaWiki software: a system of policing and governance that promotes competition, even as Wikipedia’s “Five Pillars” promote collaboration.
MediaWiki “anyone can edit” software and platforms, starting with Wikipedia and evolving into hundreds of MediaWikis around the internet, can become significant vulnerabilities on the world wide web when it comes to misinformation, disinformation, and harassment.
How many problems occur daily within MediaWiki software? It is impossible to measure; there is simply no way to account for the innumerable opportunities for exploitation that any misaligned agent can easily take advantage of.
As members of the Center for Humane Technology (Tristan Harris, Aza Raskin, and Roger McNamee) revealed in the Netflix documentary The Social Dilemma, it is now mainstream awareness that Facebook, Instagram, Twitter, and Reddit are far worse than we could ever imagine a utopian-seeming community like Wikipedia to be. When we think of misinformation and harassment, Wikipedia or MediaWikis are the last things anyone would think of.
Hiding this problem even further, the mass adoption of MediaWikis comes in the form of readership, not communities of editors, unlike their social media counterparts Twitter and Facebook.
Those who actually collaborate and create media on MediaWikis are so few that they would not even register as a blip on any social media platform, and any platform in today’s world with such a small base of creators would simply not survive and would be insignificant.
In public sentiment, we tend to view Wikipedia in terms of how we use it as an audience, not as creators.
Wikipedia is still incredible for research, I still use it quite often, and of course there are no ads. Wikipedia does not target its audience or sell its data. What could be a better social platform in today’s world?
This optimistic perception of Wikipedia, which I held deeply for a decade, collapses easily once anyone actually attempts to participate in the construction of a controversial Wikipedia article.
I am still a big fan of Wikipedia as an ideal. I am profoundly supportive of Wikipedia’s “Five Pillars” and all of its collaborative message.
The core problem with Wikipedia, as my research painfully showed me, is the software itself: while the community’s rules of engagement are collaborative, MediaWiki breeds competition.
MediaWiki and policing powers
The competition gets fierce because MediaWiki tools give some users the ability to “police” other users, anonymously.
Twenty years on, Wikipedia’s open and optimistic Web 1.0 collaborative message of an encyclopedia that “anyone can edit” has arguably produced one of the great wonders of the world.
“Everyone can edit; not everyone should,” however, is the platform reality.
Who should not edit, and who decides?
Wikipedia has a governance system that decides who can and cannot edit, and this court system even has a “supreme court,” the Arbitration Committee. That creates a sophisticated hierarchy of authority and power: its own political reality, with norms and unspoken rules.
The tools to enforce those decisions are given to some users, and over time those editors begin to “build a career” through the ranks of the Wikipedia admin system. These are users who gradually “prove” to the community that they can be trusted with more and more power to police and influence the platform.
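This division of policing power is visible in the software itself. As a minimal sketch (assuming Python with the requests library; the endpoint and parameters are part of the standard MediaWiki Action API), the following lists which user groups on English Wikipedia hold enforcement rights such as block, delete, and protect:

```python
# Minimal sketch: list which user groups on English Wikipedia hold
# "policing" rights, via the standard MediaWiki Action API.
import requests

API = "https://en.wikipedia.org/w/api.php"
POLICING_RIGHTS = {"block", "delete", "protect"}

resp = requests.get(
    API,
    params={
        "action": "query",
        "meta": "siteinfo",
        "siprop": "usergroups",
        "format": "json",
    },
    headers={"User-Agent": "mediawiki-rights-sketch/0.1"},  # courtesy UA
)
resp.raise_for_status()

# Each group entry carries a "rights" list; print the policing subset.
for group in resp.json()["query"]["usergroups"]:
    held = sorted(set(group.get("rights", [])) & POLICING_RIGHTS)
    if held:
        print(f"{group['name']}: {', '.join(held)}")
```

On a default MediaWiki installation, these rights concentrate in the sysop (administrator) group; climbing the “career ladder” described above is precisely the process of being granted membership in such groups.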
This competition turns into a battle over who ultimately controls an article, “battles to the death for very low stakes,” as noted by Wikipedia veteran David Gerard.
If anyone reading this has never attempted to edit a Wikipedia article, try it. Wikipedia’s first line of defense is aimed at new editors, and anyone who attempts to quickly spread spam or misinformation on a first pass is unlikely to succeed.
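Part of that first line of defense is mechanical: controversial pages are frequently protected so that only autoconfirmed or administrator accounts may edit them. A hedged sketch of how to inspect that state through the same Action API (the article title here is purely illustrative):

```python
# Hedged sketch: inspect a page's protection settings, the mechanical
# "first line of defense" that limits who may edit it.
import requests

API = "https://en.wikipedia.org/w/api.php"
resp = requests.get(
    API,
    params={
        "action": "query",
        "prop": "info",
        "inprop": "protection",
        "titles": "Gamergate (harassment campaign)",  # illustrative title
        "format": "json",
    },
    headers={"User-Agent": "mediawiki-protection-sketch/0.1"},
)
resp.raise_for_status()

# Each protection entry names the action restricted, the user level
# required, and when the restriction expires.
for page in resp.json()["query"]["pages"].values():
    for p in page.get("protection", []):
        print(f"{p['type']} requires '{p['level']}' until {p.get('expiry', 'infinity')}")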
It takes time to learn how to “game” Wikipedia. Doing so requires a “long-tail” strategy of building support within the community of editors, and that takes time for any influence agent.
But Wikipedia is twenty years old. Influence teams could easily have started playing the long-tail strategy long ago, with plenty of time to move up the ranks.
So Wikipedia’s digital governors could be anyone: any college student, any government agency, foreign or domestic, any political operative.
The comparison of this culture to Lord of the Flies has been made on so many occasions that I would rather avoid the redundancy and not use it, again, to describe the bizarre experience of this peculiar, ambitious, and proud online social culture. But the comparison is simply too perfect.
How so? The quest for the conch: the quest for control of editing permissions, and for wielding those permissions against other users. The tools to suppress. If everyone had the same tools, there would be no problem. In reality, Wikipedia operates through users hyper-gaming the system to obtain those tools, engaging in some of the most elaborate and psychologically abusive manipulations the internet has ever witnessed, and then using those same tools to bolster their own social presence at the expense of others.
Therefore, the “power” to obtain permissions to police other users becomes an ambition, a motivating force on Wikipedia, even though the platform’s every rule dictates the opposite: “good faith” collaboration.
What makes this incredibly vulnerable, if not intractable, is that the Wikipedia editors who hold policing power are anonymous. They could be anyone.
While there is no doubt that Wikipedia has many incredible contributors, researchers, university students, and projects, there is no way to ever have any transparent accounting here. There is no respectable publisher in the world that would allow its editorial staff to be anonymous.
So while a Wikipedia editor could be a diligent university student, it could also be an influence operator, a corporation or lobby, a government agency, foreign or domestic, or an ideologically driven organization.
There is no way to tell. The community governs itself anonymously, with no oversight from the Wikimedia Foundation.
The entire trajectory of managing the world’s largest encyclopedia is in the hands of this community, which, my research shows, can become highly toxic without any oversight, and worse, in a manner that is intractable.
Facebook has possible solutions; Twitter has possible solutions.
What are the possible solutions to the MediaWiki dilemma? There are none. The entire system of MediaWiki is an intractable problem on the web, and in my view, that makes this problem all the more concerning.
Canary in a coal mine
Amplifying the MediaWiki problem is big tech itself.
Big tech companies like YouTube and Google, having poor tools of their own to handle online misinformation, now lean on Wikipedia for fact-checking and reliability.
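That dependency is concrete: Wikipedia exposes machine-readable endpoints that downstream products can consume wholesale. A minimal sketch (assuming Python; the Wikimedia REST summary endpoint is public, and the article title is illustrative) shows how directly an article’s current text can flow into a “fact-check” surface:

```python
# Minimal sketch: fetch the machine-readable summary of an article from
# the public Wikimedia REST API, the kind of payload that downstream
# "knowledge panel" features consume.
import requests

title = "MediaWiki"  # illustrative article title
url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
resp = requests.get(url, headers={"User-Agent": "wiki-summary-sketch/0.1"})
resp.raise_for_status()

data = resp.json()
print(data["title"])
print(data["extract"])  # whatever the article's editors last left in place
```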
Because of that, Wikipedia articles are likely to be targeted by more sophisticated influencers who understand that, over time, they can wield that sort of power to influence fact-checking itself, increasing the social competition within the editing communities.
The fact that big tech would look to MediaWiki platforms as the solution to the misinformation problem they are amplifying should be of the greatest concern, for if this is the “hope” for solving big tech’s problem, the emperor is truly naked; there are no solutions.
Other digital communities that face online misinformation and harassment (Facebook, Twitter, etc.) can at least pursue technical solutions to those problems or impose system oversight when abuses happen.
The Wikimedia Foundation, the parent non-profit, has no oversight power to enforce platform or community changes to the digital governance architecture of Wikipedia; that is up to an anonymous community, composed of anyone, to decide.
Because of this flaw, and the flaw in the parent organization’s relationship to the oversight of MediaWiki software in the community, MediaWikis are a remarkable vulnerability on the internet for the spread of online misinformation, in a manner that is in many cases intractable.
With Wikimedia, the problem is baked in. The Wikimedia problem is permanent as long as MediaWikis govern the broadcast of the world’s largest encyclopedia.
Access the case study here.