The Untold Truth Of Wikipedia

Wikipedia is everywhere. To borrow a phrase, it surrounds us and penetrates us. It binds the galaxy together. Okay, that might be hyperbole, but nearly 20 years after its founding, it's hard to imagine a world without Wikipedia. Just about every article, term paper, slide deck, and angry letter to an editor begins with a quick visit to the online, crowdsourced encyclopedia. And everyone has lost a few hours here and there to the various rabbit holes provided by Wikipedia's stranger pages.

Despite its ubiquity and fame, you probably don't know much about Wikipedia's history and secretive culture. After two decades of evolution, infighting, controversy, and more than six million pages in English alone, Wikipedia is one of the few big ideas from the early days of the Internet that hasn't just survived but has actually thrived. (It's the only nonprofit website in the top ten in terms of search traffic.) Once you go down that rabbit hole, you find that the untold truth of Wikipedia is probably just as interesting as any page you can find on the site itself.

Wikipedia was originally more like a traditional encyclopedia

The Wikipedia we all know and love is a crowdsourced collection of knowledge, with an army of volunteers writing, editing, and constantly debating the information contained within its virtual pages. The site is actually kind of inspiring, an example of people donating their time and expertise to make the world a better place.

But that actually wasn't the original intention. As Entrepreneur explains, Jimmy Wales and Larry Sanger originally launched something called Nupedia, which was intended to be a peer-reviewed online encyclopedia written by paid experts, much like a traditional encyclopedia. (Back in 2005, Slashdot posted what Sanger wrote about it: "Nupedia was to be a highly reliable, peer-reviewed resource that fully appreciated and employed the efforts of subject area experts, as well as the general public.") When work slowed to a crawl, they hit upon the idea of a wiki to source content, but the wiki was initially intended to be a supplement to the expert-written material.

As Wikipedia itself notes, there was a lot of resistance to the idea of combining a traditional approach with this newfangled wiki concept. As a result, Wikipedia was split off to be a separate entity in order to keep Nupedia's reputation clean — and quickly became much more relevant and successful than its parent site, which shut down for good in 2003.

One of Wikipedia's co-founders left almost immediately

Larry Sanger was Jimmy Wales' partner when they started working on Nupedia, the predecessor to Wikipedia. Nupedia was supposed to be more like a traditional encyclopedia, with articles written by paid experts. Nupedia began in 1999, but the process of creating and reviewing articles was extremely cumbersome and slow — as Slashdot notes, by 2001, fewer than 24 articles had been approved.

Wikipedia launched that year, originally intended to feed content into Nupedia, and Sanger was in charge of organizing the new site and developing its processes. Sanger says that, "When the more free-wheeling Wikipedia took off, Nupedia was left to wither," and he believes it could have been salvaged if it hadn't been starved of resources as a result of Wikipedia's success.

Sanger left both projects in 2002. As The Independent reports, Sanger felt the site was quickly "overrun by trolls" and became a case of "inmates running the asylum." When Wikipedia began to take off, Sanger tried to impose a more formal and regimented system for writing and approving articles, but, as he later told Vice, "by the time the new recruits arrived — the anarchist crowd, as I called it at the time — all that stuff became deeply unpopular."

Wikipedia has had two failed competitors

If there's one absolute rule of the Internet, it's that success breeds imitators, so it's not surprising to learn that there have been attempts to create competing online encyclopedias.

The first was launched by one of Wikipedia's co-founders, Larry Sanger, who left the project shortly after it launched due to a fundamental disagreement with its structure and processes. Sanger had originally been brought in to work on Nupedia, Wikipedia's predecessor that was intended to be like a traditional encyclopedia. When Wikipedia, originally intended to supplement Nupedia, began to take off, Sanger left. As Wired reports, he later launched a site called Citizendium. Citizendium was modeled on Nupedia's concept of a "partnership" between paid experts and amateur volunteers but struggled to find those volunteers as well as steady funding. The site is still up and running, but it doesn't get much traffic.

Another competitor, Scholarpedia, was launched to be a "peer-reviewed open-access encyclopedia" with a more academic slant, but Wired notes that it has fewer than 2,000 articles. Both sites ran up against the same problem that plagued Nupedia: Experts like to be paid for their expertise and are reluctant to let nonexpert "volunteers" assist or review their work in any way. Plus, of course, Wikipedia exists, so convincing folks to visit another wiki entirely is always going to be an uphill battle.

Wikipedia is no longer considered unreliable

For a very long time, Wikipedia was considered a bit of a joke in terms of accuracy and reliability. We don't want to imply that Big Encyclopedia launched a whisper campaign against the site ... but they totally did. Students were told not to cite Wikipedia, and millions of people everywhere spent long hours tracking down more "legitimate" sources for information they found there.

It's easy to understand why this would be the case. In theory, literally anyone can edit a Wikipedia page, which invites a lot of temptation to vandalize pages you don't like or to seed disinformation and propaganda throughout the site. And for years, that was given as the main reason not to trust anything you read on Wikipedia.

But that's changing. As CBS News reports, Wikipedia is now considered one of the more reliable sources of information, especially when it comes to real-time information about developing events. And scientists and academics are even publishing papers asking point-blank why Wikipedia isn't getting more respect. This is in part due to a dawning realization that everything that supposedly made Wikipedia inaccurate — the crowdsourcing and the ability for anyone to edit pages — is actually a strength. Errors and disinformation tend to be reversed very quickly, because someone is always watching, and if you click on the "Talk" tab of a Wikipedia page, you're likely to find a robust and detailed discussion regarding the smallest details of that subject.

One guy is behind one third of Wikipedia

Wikipedia is billed as the site where anyone can add to or change the content. While that's technically true (as The Atlantic notes, some 8.6 million registered editors have made at least one change to an article), the reality is that a small number of very active people create and revise most of the content. Vice found that 77 percent of Wikipedia articles are written by just one percent of the editors on the site, which is kind of incredible, considering that the English version of the site now has more than six million articles.

But it's actually even less diverse than that. As CBS News reports, one man has had a hand, in one way or another, in about a third of all the articles on the English-language site. Steven Pruitt has written an astounding 35,000 original articles, and — equally impressive — he's made more than three million edits to existing pages.

Since he doesn't get paid for all this work, that might seem like the definition of having too much time on your hands. But aside from enjoying the work and getting personal satisfaction from it, there are some perks. Time Magazine named Pruitt to its list of 25 most influential people on the Internet, for example, alongside Donald Trump and Kim Kardashian West.

Wikipedia has some deeply weird policies

If all you've ever done with Wikipedia is surf some pages, you're missing out on one of the weirdest and most passionate cultures on the Internet. According to The Atlantic, there are 8.6 million registered editors who have made at least one edit. That's a huge population to manage, and Wikipedia's policy statements are a throwback to the early days of the Internet, when things weren't nearly as corporate and buttoned-up as they are today.

As Wired reports, Wikipedia has plenty of policies covering some pretty weird scenarios — and those policies often have ridiculous names. A great example is No Angry Mastodons, a policy covering how to de-escalate disagreements and avoid a "fight-or-flight" response when you clash with people who are just as passionate about a subject as you are. Then there's No Climbing the Reichstag Dressed as Spider-Man, which discourages editors from making big, dramatic scenes in order to promote their side of an editing dispute, or No Curses, which explicitly asks that no one place any curses or magic hexes on their perceived content enemies.

While some of this might seem silly, it speaks to Wikipedia's freewheeling culture. Best of all, these policy pages are, like everything else on Wikipedia, editable by anyone.

Wikipedia's policies are extremely complex

Like any large community, Wikipedia can be a complicated and unwieldy place. In order for it to function, there have to be rules — a lot of rules.

As Wired reports, of the nearly nine million editors, only about 1,100 actually have administrative status, with the ability to overrule or ban people or lock pages down. More importantly, the policies that Wikipedia has developed have grown into a complex and intricate system involving more than 150,000 words — the length of several novels. Like any body of written laws and rules, there's a lot of interpretation involved, so if you ever intend to get into a debate over how Wikipedia does something, you're going to have to be something of an Internet lawyer and bone up on the minutiae of Wikipedia Law.

And it's only getting more complex as time goes on. As The Verge reported, Wikipedia is drafting new rules to strengthen its harassment policies, and Slate notes that the site's increasing use as a "breaking news" source of information is putting many of these policies to the ultimate test, resulting in even more edits and additions that will make the rules even more difficult to master.

Wikipedia is locked in an endless war

The whole point of Wikipedia is that anyone can write or edit an article. But those articles and edits are also constantly being reviewed by the other editors and administrators on the site, which leads to some pages changing on a fairly constant basis. This ranges from mild disagreements over word choice to pitched battles over facts and sources.

It also means that organizations and individuals will forever seek to change what their Wikipedia page says. This goes straight to the top: As Time Magazine reports, in 2014, Wikipedia placed a ten-day ban on IP addresses originating from the United States Congress because of the volume and nature of the edits being made. According to Reuters, the CIA and the FBI have both also been caught making changes to Wikipedia's pages.

Corporations have also tried to subvert Wikipedia for their own sordid advertising goals. As The Verge reports, in 2017, Burger King aired a commercial that tried to prompt people's voice assistants to read out a description of the Whopper from Wikipedia. That's creepy enough, but Burger King also edited the entry just before the commercial aired so that it was much more marketing-friendly. Wikipedia reverted the changes quickly, but the war against what it calls "conflict-of-interest editing" continues.

Wikipedia had a corruption scandal

You might not think of Wikipedia as a place to make unethical monetary gains. A place for harmless Internet pranks? Sure. A place for pedantic arguments over obscure interests? Absolutely. But it's not a place you'd expect to read about shakedowns and shady business practices. In fact, co-founder Jimmy Wales has insisted that the site remain not-for-profit, and it regularly begs for donations so it doesn't have to take advertising that might influence its article curation.

But that doesn't stop people from trying to make money off of Wikipedia. According to Business Insider, two Wikipedia employees used to have a side project: They ran a public relations firm. That's not a problem in and of itself, but part of the service they offered was changing their clients' pages in order to eliminate negative information. They apparently referred to this service as "sanitizing reviews."

Inc.com reports that this led to changes in Wikipedia's policies requiring editors to divulge conflicts of interest, and the site launched a campaign to identify editors who were being paid to make changes. The Verge notes that Wikipedia also launched a huge investigation into so-called "sockpuppet" techniques, leading it to send a cease-and-desist letter to a Texas PR firm that was changing Wikipedia pages to benefit its clients.

Wikipedia has a diversity problem

The Internet is a raucous, crowded place — but it's also largely faceless and often anonymous. When we dip into Wikipedia to get some quick info, or even when we spend hours there researching, we don't often consider who's behind all that info. We're actively discouraged from wondering, in fact — that's why no one gets a byline or author credit on Wikipedia pages. It's supposed to be a crowdsourced document.

But whenever someone digs into the actual composition of active editors on Wikipedia, some disturbing facts come out. The Atlantic dug into the data and found that a huge proportion of Wikipedia's editors are white men. (Wired estimates that as many as 90 percent are men, in fact.) Worse, Wired also reports that many of the editors who aren't white or male report frequent harassment, including doxing, death threats, and other alarming actions. But Wikipedia is limited in its responses because of its volunteer nature and opaque culture.

The site is clearly aware of these problems. The Verge reports that it has begun drafting new policies to counter some of the worst behavior. This includes, for the first time, a more formal and regulated process for discussion moderation to protect its editors from being abused and threatened when they're just arguing for an edit.

Wikipedia launched its own social network

Few of us can remember a time before social media. It can seem like we've been tweeting and posting to Facebook since forever, but social media is a remarkably recent development – Twitter only launched 14 years ago, Facebook only 16 years ago. The fact that we can't envision a life without it just shows how successful it is as a concept.

So it shouldn't be surprising that, as the Financial Times reports, Wikipedia co-founder Jimmy Wales launched his own social media platform in 2019: WT:Social. Nor should it be a surprise that WT:Social is intended to be a more trustworthy and reliable alternative to other social media. Wales has stated that he hopes the new platform will help to fight disinformation and "clickbait" that is so easily passed around on other social media networks. Wales hopes that by relying on donations, like Wikipedia does, WT:Social can avoid the "low-quality content" that dominates other social platforms.

Sadly, WT:Social hasn't taken the world by storm. Shortly after launch, it had only 50,000 members (compared to Facebook's 2 billion), and it ranks well outside Alexa's top 100,000 sites as of this writing. Since there have been a lot of attempts to supplant Facebook and Twitter (anyone remember Ello?), chances are that WT:Social won't be the next Wikipedia.

A lot of Wikipedia is created by artificial intelligence

When you think of the people who write and edit Wikipedia pages, you might think of weirdly obsessed superfans, or bored academics, or social justice warriors with an ax to grind — but you definitely think of people. One of Wikipedia's most notable attributes is that it is crowdsourced, and anyone can write or edit an article.

As Science Daily reports, the word "anyone" can and does include nonhumans. Artificial intelligence, in the form of "bots" (automated software programs), plays a huge role in monitoring pages, correcting mistakes, and keeping content up to date. In fact, more than 1,600 of these bots toil constantly.

As Digital Trends explains, these bots perform essential functions, most notably defending the site against vandalism. With more than 180 edits being performed every minute, fighting vandalism is simply too huge a task for human beings, so bots handle a great deal of the dirty work that keeps Wikipedia reliable and functioning. Bots do other stuff, too, like creating redirect pages, checking and fixing broken links, regularly updating statistics, and even identifying and reporting policy violations in articles. In other words, while we think of Wikipedia as a bastion of human knowledge, it wouldn't be possible without some artificial intelligence.
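If you're curious, you can watch that bot activity for yourself. Wikipedia runs on MediaWiki software, and its public API exposes a "recent changes" feed that can be filtered down to edits flagged as bot edits. The short Python sketch below is purely illustrative — it isn't anything Wikipedia publishes, and the script name and User-Agent string are made up for the example — but the endpoint and parameters are standard MediaWiki.

```python
# Illustrative sketch: list the most recent English Wikipedia edits that
# are flagged as bot edits, via the public MediaWiki "recentchanges" API.
# Assumes the third-party `requests` package is installed.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "recentchanges",
    "rcshow": "bot",                  # only edits flagged as made by bots
    "rcprop": "user|title|timestamp", # fields to return for each change
    "rclimit": 25,                    # the 25 most recent bot edits
    "format": "json",
}

response = requests.get(
    API_URL,
    params=params,
    headers={"User-Agent": "bot-edit-peek/0.1 (illustrative example)"},
)
response.raise_for_status()

for change in response.json()["query"]["recentchanges"]:
    print(f'{change["timestamp"]}  {change["user"]:<25}  {change["title"]}')
```

Run it a few times in a row and notice how little the timestamps move between calls — a small, hands-on hint of just how relentlessly those 1,600-plus bots churn through the site.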