Wikipedia is now so good that we don’t tend to think about how good it is. It’s just there, and it feels as if it’s always been there. “Oh yes, the vast free repository of human knowledge, what about it?” But Wikipedia is remarkable. Astonishing, really. It’s built on a model that exists almost nowhere else online. And it gives us a bit of insight into how we might reform other platforms, even society itself.

First, let’s remember just how “top-down” almost all of the largest web services are. Google, Facebook, Amazon, Twitter. Each a multi-billion dollar company, each run for profit and owned by private investors, each controlled by a powerful CEO. Facebook, Twitter, and Google make money almost entirely through advertising: Companies pay them to put products in front of users’ eyeballs, and the platforms alter the user experience accordingly. Each of these companies operates exactly the way you’d expect of a corporation seeking monopoly power. They crush tiny competitors, they buy politicians (Google and Facebook give more money to Republicans than Democrats), and they are extremely secretive about their internal decision-making processes. They do not tell you the algorithms that determine what they will show you, or the experiments they are using to figure out how to manipulate users’ psychology. (Facebook had a brief scandal in 2014 when it was revealed to have tested different ways to mess with people’s emotions, seeing if it could bump users toward happiness or sadness with the display of positive or negative news. Nearly 700,000 news feeds had been tampered with. It also, even more creepily, kept track of status updates that people had typed and deleted without posting.)

There is nothing even beginning to resemble “democracy” at these companies. Like most corporations, they are dictatorships internally. Amazon workers are infamously mistreated, and Jeff Bezos’ wealth increases by $215 million per day, meaning that he makes the median annual Amazon salary once every nine seconds. Amazon workers have had to fight hard just to reach $15 an hour base pay, which in many places is still far short of a living wage.
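The arithmetic behind that nine-second figure is a straightforward unit conversion. Here is a quick sketch using only the $215-million-a-day figure from the paragraph above (the variable names are my own):

```python
# Convert $215M/day of wealth accumulation into a per-second rate,
# then see how much accrues in nine seconds.
DAILY_GAIN = 215_000_000  # dollars per day, figure from the article

per_second = DAILY_GAIN / (24 * 60 * 60)   # seconds in a day
nine_seconds = per_second * 9

print(f"${per_second:,.0f} per second; ${nine_seconds:,.0f} every nine seconds")
```

On these numbers, nine seconds of gains comes to roughly $22,000, which is the median salary the claim implies; the exact seconds-per-salary figure shifts with whichever median-pay estimate you plug in.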

Users do not have any voice in company policy. Their only input comes in the form of a binary market choice: Use the service or don’t. If you don’t like the way Facebook’s ads work, go find some other social media network. But because there is no other social media network like Facebook, there is no online shopping portal like Amazon, and there is no bottomless pit of short-form blather like Twitter, it’s not clear where else there is to go. You can get the internet out of your life completely, which might be good for many people’s mental health. But some of us actually rely on these companies for our livelihoods. Current Affairs articles, for instance, are distributed primarily through Facebook and Twitter shares. If these companies were to, say, block our accounts, or even bump our material downward so that fewer users saw it, we’d take a significant hit to our revenue. And if we did incur the ire of one of the tech monoliths, if Amazon stopped selling our books or Google stopped displaying our pages in search results, there would be absolutely nothing we could do about it. They don’t have to give us an appeals process. Nobody votes on it. If Mark Zuckerberg wakes up one day and decides he’d like to put Current Affairs out of business, he could probably do it. We are reliant entirely on maintaining the good will of a benevolent overlord.

Illustration by Naomi Ushiyama

Wikipedia is something else. It has no advertisements, it seeks no profits, it has no shareholders. It’s incredible to think of the amount of money Wikipedia has given up by steadfastly refusing to publish even the most unobtrusive promotions. In the early days, when there was still a live debate about whether the site should have ads, even just ads for nonprofits, there were those who thought it insane to insist on keeping the site absolutely pure. And yet the purists won.

Being a nonprofit among the profit-seeking monopolies distinguishes Wikipedia. But what makes it like absolutely nothing else in the world is its governance structure: It is a genuinely democratic platform, its rules controlled by its users.

Let’s consider the radicalism of the Wikipedia model. It’s a “free encyclopedia that anyone can edit,” as we know. It has well over 5 million articles in English (40 million total in 301 languages), all of which are put together through the collective effort of volunteers. Readers write a paragraph here, fix a date there, add a citation or two, and over time a vast compendium of human knowledge emerges. It has been stunningly successful, and is one of the most visited sites on the web, with over 18 billion page views a month.

But Wikipedia is not just edited by users. Its policies themselves are stored in wiki pages, and can be modified and updated by user-editors. The governance of the site, the processes that determine what you see, is open to revision by the Wikipedia community, a community that anyone can join. Not only that, but Wikipedia is fully transparent: Every change, and the debate over it, is preserved in a public record.

One of Wikipedia’s core rules is: “Wikipedia has no firm rules.” That does not mean “anything goes.” It means “the rules are principles, not laws” and they “exist only as rough approximations of their underlying principles.” But the ethic of Wikipedia is that everything is subject to revision, open to discussion, and that anyone can discuss it.

This has meant that Wikipedians have had to, over time, figure out how to govern themselves. Political philosophers have long been infatuated with the concept of the “state of nature,” the condition humankind would find itself in before it had designed governing institutions, and much political theory is concerned with examining how the people in this hypothetical world should construct a state. Or, if humankind suddenly found itself stranded on a desert island, what procedures ought it to set up to keep everybody from eating each other? The course of Wikipedia’s development has been one of the few real-world examples of such a scenario.

The site began chaotically; in its early days, it was frequently criticized for the presence of misinformation. In an incident that became notorious, journalist John Seigenthaler wrote about the problems with his own Wikipedia entry, which falsely implied that he had been a suspect in the assassination of John F. Kennedy. Its editorial discussions were a mess, too. It wasn’t clear how the users should resolve disputes about what to include and what not to include, or how to make an article reflect a “neutral point of view.” (Some right-wing activists were so disgusted with the site’s supposed bias that they founded the much-derided “Conservapedia,” which promised it would be “free of corruption by liberal untruths.” Conservapedia on Harry Potter: “[A]t Hogwarts, chapel is conspicuously absent. A failure to mention Christianity, combined with the presence of wizardry, have led some to wonder whether Rowling is substituting paganism for Christianity.”)

Over time, however, Wikipedians sorted themselves out. They developed their practices. They figured out how to stop trolls and vandals. They use consensus-based decision-making: What goes into articles is not the result of a simple up-or-down poll among editors, but comes out of a discussion process whereby the editors are supposed to work together to reach compromise. They established procedures for adjudicating disputes, including some hierarchies about who could decide what and who would have the final say. If the community couldn’t sort out a disagreement itself, the dispute would go to an “arbitration committee” elected by the volunteer editors, and editors would each make their case to the committee.

Watching editorial discussions on Wikipedia can be fascinating. The community has developed a code of conduct (Wikiquette) designed to encourage collegial and productive collaboration, and prevent disagreement from descending into acrimony:

Respect your fellow Wikipedians, even when you disagree. Apply Wikipedia etiquette, and don’t engage in personal attacks. Seek consensus, avoid edit wars, and never disrupt Wikipedia to illustrate a point. Act in good faith, and assume good faith on the part of others. Be open and welcoming to newcomers. Should conflicts arise, discuss them calmly on the appropriate talk pages, follow dispute resolution procedures, and consider that there are 5,716,660 other articles on the English Wikipedia to improve and discuss.

Some aspects of Wikipedia’s culture are almost sickening in their loveliness. Users give each other awards called “barnstars.” Anyone can create one of these awards, and give one to anyone else. Some examples:

The Ray of Sunshine — “bestowed on that person that, when you see their name at the top of your watchlist, you know that all is right with the world”

The Resist Hivethink Award — for those who buck consensus in productive ways

The Good Friend Award — “for people who have helped new/inexperienced Wikipedians to successfully create their first article”

The Resilient Barnstar — “given to any editor who learns and improves from criticisms, never lets mistakes or blunders impede their growth as Wikipedians, or has the ability to recover/finish with a smile.”

There is even an award given to people who have been treated unfairly by others, as a way of saying sorry and encouraging the community to be nice to them. It all makes for a thought-provoking prototype for what a non-monetary system of social rewards might look like in a post-capitalist society. (I wonder if any teachers have tried this kind of “anyone can give anyone else an award at any time” system with their kids.)

It is not all peace and love and rays of sunshine at Wikipedia, though. Things get very contentious—predictably, the “talk” page for the article “Gaza Strip” contains a heated dispute over use of the term “military occupation.” But it mostly gets figured out. Here, for instance, are some excerpts from a debate over whether the lede for the “Donald Trump” article should contain the sentence: “Trump first stirred controversy in Republican politics over his promotion of birther conspiracy theories”:

Please indicate whether you support or oppose something similar to the above text, along with your reasoning.

Oppose — Insignificant for lede – was less significant than his reality TV and other activities at the time, and definitely insignificant in relation to being elected.

Support the idea, but oppose the wording on the basis the sentence is awkward to parse.

Support — Most political science and history treatments of Trump’s involvement in politics notes that the birtherism is an important and noteworthy initial stepping stone for Trump, bringing him great prominence.

Support — Very relevant to Trump’s political career, much more so than reality TV, for example.

Oppose — Undue weight in my view for the lede itself. Not a significantly major event in Trump’s life and the text in the main body of the article is sufficient in my opinion.

Today, the sentence does not appear in the article’s lede, though the body text discusses Trump’s “birtherism” in some detail.

The workings of the “arbitration committee” can also be fascinating to observe. It operates a little like a court: Users request the arbitration committee get involved, the committee agrees to hear a case, the evidence from both sides is presented, and the committee votes and issues a final decision. When I looked, there was a dispute over whether an editor who described another as “pushy” had, among various other offenses, been engaging in a “personal attack” in violation of the rules.

But the procedures and hierarchies Wikipedia has developed can also be somewhat complicated. The arbitration guidelines run for pages and pages. Bureaucracy always tends to spread like a fungus, and some have argued that as Wikipedia has developed its guidelines to deal with every scenario, the site has become unwieldy. The Wikipedia community has its own specialized language, and you need to know a number of terms that are more familiar to programmers than to the rest of us (the arbitration pages discussed the presentation of “diffs,” a computing term which here refers to color-coded pages displaying the differences between one version of an article and another). There’s a learning curve, and that means that only certain types of people are going to want to become Wikipedia editors. One criticism that has been made is that even though theoretically “anyone” can edit the encyclopedia, in practice only a small number of people make a large percentage of the edits.
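The “diff” concept itself is easy to demonstrate. A minimal sketch using Python’s standard difflib module follows; the two “revisions” below are invented for illustration, but the output has the same structure as the comparisons Wikipedia renders as color-coded pages:

```python
import difflib

# Two invented versions of an article, one line per list entry.
old = [
    "Wikipedia is a free encyclopedia.",
    "It was founded in 2001.",
]
new = [
    "Wikipedia is a free encyclopedia that anyone can edit.",
    "It was founded in 2001.",
]

# unified_diff marks removed lines with "-", added lines with "+",
# and unchanged context lines with a leading space.
diff = list(difflib.unified_diff(
    old, new, fromfile="revision 1", tofile="revision 2", lineterm=""
))
print("\n".join(diff))
```

On Wikipedia itself, every revision of every page, policy pages included, can be compared in exactly this way through the page’s “View history” tab.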

I’m not sure how valid that is as a criticism. After all, any volunteer project like this is probably going to rely heavily on “hardcore” contributors. But it does make for a problem of bias: Wikipedia’s users are disproportionately white and male, which means that the articles can tend to reflect their interests (good lord, you should see how in-depth the ones on video games are). Wikipedia necessarily reflects certain wider social inequities; working moms have less time to contribute to unpaid editing work on encyclopedias, so working moms have a lesser part in the collective “voice.” The “unpaid” nature of the work raises some problems, since it’s going to mean that those with money and leisure time get to build the thing, and determine what goes into the Definitive Compendium of Human Knowledge. On the other hand, Wikipedia is far less “elitist” than any other encyclopedia ever created: You don’t have to have any credentials or connections to participate. You just have to know what you’re talking about. The barriers to access are still present, but the level of openness has no parallel elsewhere.

On the whole, it’s impressive how close Wikipedia has come to achieving its stated goal of “viewpoint neutrality.” The articles are, for the most part, reliable and trustworthy. I’d defer to Wikipedia over any other encyclopedia, in part because Wikipedia is so transparent: I can see all of the sources, and all of the discussion and changes that have led to the article existing in its present form.

Wikipedia’s unique participatory model has actually allowed it to escape the kinds of scandals that have eroded public confidence in other platforms. Whereas Twitter and Facebook have been criticized for allowing “fake news” to proliferate, and for poor judgment in deciding which content to remove, Wikipedia hasn’t had a major public embarrassment in a long time. Facebook has run into trouble for decisions like removing breastfeeding pictures and war photography as part of its “anti-nudity” policy, and has only reversed course after significant public pressure. Wikipedia, too, makes content moderation decisions on a daily basis, but the reason you don’t hear about them is that the arguments are resolved through the site’s own processes.

If you object to something Facebook does, you cannot change it yourself. You cannot even ask Facebook to change it; they’re very unlikely to amend corporate policy on the basis of what one person tells them. You will have to, instead, campaign publicly for the change, and hope Facebook listens. If your campaign gets enough attention—as the outrage over breastfeeding photos did—then the company may be sufficiently embarrassed to reverse course. But you will never know what went into the decision one way or the other. It will all be opaque. Nothing is opaque on Wikipedia. There are transcripts. Records. Everything is hashed out in the “public square.”

Decisions about moderation are “value-laden.” How do you decide what constitutes “harassment”? Should racist speech be permitted, and who should determine what racist speech is? If one person is offended by nudity, and another person isn’t, should our platform screen out nudity? At the major tech companies, these decisions are made by unelected groups of Silicon Valley executives. Mark Zuckerberg has repeatedly said that he doesn’t believe in the right to privacy. So Facebook doesn’t respect people’s privacy. Judgments do not reflect the collective will, but the preferences of whichever rich dork happens to rule the company.

Wikipedia shows us that this doesn’t have to be the case. Participatory governance can work. It has problems, but those problems are for the community itself to work out. Imagine if our social networks were run on a “Wikipedia model.” What if Facebook’s terms of service, its newsfeed algorithms, its features and options, were all determined through the deliberation of users themselves? You might think this would be a disaster; many people fear democracy and think rule by philosopher-kings is preferable. I think the problems of Facebook, Twitter, Google, and Amazon, contrasted with the success of Wikipedia, show that the opposite is true. It’s rule by kings that is the disaster, because kings never actually understand what’s good for the community as well as the community itself does.

There are also severe dangers that come with concentrating power in the hands of an unelected minority. Facebook’s algorithms are completely secret, which means that we don’t know when and how we’re being manipulated. If the company wished to, it could send voter registration reminders to only those users from one political party. It could inflate or deflate the significance of a news event, make or break a media company. And these possibilities are not just theoretical: Last year, Facebook decided that too much news was making people unhappy, and was causing them to spend less time on the service. So news articles got de-prioritized in the feed, while dog and baby photos were elevated. Many people probably didn’t even notice the change. Media outlets noticed, though. Small tweaks mean big changes in traffic and revenue.

Obviously, it’s hard to imagine everything running like Wikipedia runs. For one thing, Wikipedia requires only a very small paid staff, because it doesn’t have much real-world infrastructure. Amazon, on the other hand, has half a million employees. But participation is a principle: Applied to Amazon, it might mean that all workplace policies are set by employees. Managers can be voted on and recalled by workers. It might mean something else. Wikipedia shows one version of an adaptable ideal: The people who are affected by decisions ought to be the ones making them.

Wikipedians are firmly insistent that the site is not a “democracy,” because it doesn’t defer to “majority rule” and instead uses a consensus process. I don’t agree with them. They’re a democracy, whether they like the term or not. Democracy is commonly misunderstood as mob rule, but it isn’t. Democracy is about participation in power, and Wikipedia’s experiment is the closest thing we have to a complete open-participation model. It is a little outpost of communism in the brutal capitalistic world of Silicon Valley: Nobody owns it, everybody is equal, there is no money exchanged, advertising is banned, and people do things because they like doing them, rather than because they have to or because they’re paid for it. These values are completely absent from the other major information channels on the internet. But they’re good values. They should spread. A world run like Wikipedia would be a wonderful world indeed.

This article originally appeared in Issue 15 of Current Affairs.
