The debate about Wikipedia illustrates its coming obsolescence

A screen capture of the "Talk" page on Wikipedia's acupuncture article. Yellow boxes show links to 35 archive pages, a contentious topics warning, and an arbitration on pseudoscience note.
A whole lotta talk.

A quick preface on current events:

It is alarming to see members of Congress demanding the identities of Wikipedia administrators, and the acting U.S. Attorney for Washington, D.C. threatening the Wikimedia Foundation's nonprofit status. The danger is that political actors are trying to replace an independent institution with one that validates their own preferred narratives. These actions make me, like many others, want to grab my axe and bow to defend the project. Wikipedia's independence is crucial and must be defended. But defending it honestly requires acknowledging that the story of its trustworthiness is more complicated.
This political fight, like so many online debates, generates heat without addressing core problems. Wikipedia's long-running structural dysfunction has created the very vulnerabilities now being exploited. If we want the project of open knowledge to survive – and we should – these are problems that must be solved. What follows is offered in that spirit. (And includes the story of how I got banned from editing the acupuncture article for asking why the WHO, NIH, and Cochrane are considered unreliable sources.)
Ready? Ok.

Wikipedia is an amazing feat of human engineering and collaboration. It isn't just the assembly of knowledge, which is enormous, but the method by which it is assembled. Wikipedia's official policy documentation is estimated at 150,000 words (this does not include guidelines, which extend to millions of words; the US Constitution is 4,400 words). With 50.1 million registered accounts generating over 5 billion words across 7 million articles, it is among the largest single human information projects in history – if not the largest.

But such a massive undertaking is not without serious flaws, and it is, at the end of the day, an online community of volunteers – one that has been under considerable stress for some time. Right now the debate is about AI – and Wikipedia collectively is very upset about AI – but the project has been ruminating on losing editors since 2011. The way the current controversies and arguments about Wikipedia keep forcing the narrative into binary comparisons is an artifact of the attention economy – an emergent mechanical dynamic that threatens not just Wikipedia, but democracies around the world.

Readers here should be unsurprised that the mechanics that produce this emergent effect are what interest me the most. Jimmy Wales storming out of an interview over the question of whether he founded or co-founded Wikipedia is a horrible, breathtaking illustration of the very problem that the next phase of the internet needs to fix. There are questions that cannot be answered with a sound bite, and cannot sufficiently be answered from one perspective.

I realized this was sort of breaking into a higher level of attention when Hank Green devoted a video to it. Mostly he falls into the same trap of making the question “is Wikipedia good or bad?” and its corollaries “is Wikipedia woke?” and “is Jimmy Wales good or bad?”, but he uses one phrase that is at the core of our present crisis of information: “Enrolled in the war over who gets to define reality”.

The project of Wikipedia is precisely this, and if we turn this “war over who gets to define reality” into a politically polarized left-versus-right problem, we will lose the larger war: building the next generation of critical thinking required to establish truth in the information age.

Is Wikipedia trustworthy?

This is a question that, considered honestly, does not have a binary yes or no answer. Like most complex questions, it has necessary follow-on questions: Which Wikipedia? When? On what topics? Trustworthy to whom? 

A lot of people are arguing over the binary question of whether Wikipedia is neutral or has a left-wing bias. Wikipedia's slight left-wing bias has been studied extensively and recently reconfirmed. Its other biases – cultural bias (76% of activity is on English Wikipedia, where as many as 89% of editors are white) and gender bias (Wikipedia itself reports that as many as 90% of its editors identify as male) – have also been well-documented and remain insufficiently addressed despite years of work by the community. To my mind, its most problematic bias may be socioeconomic, and that one cuts across all the others.

The problem is that there is one Wikipedia, when there is not one definition of truth. Assertions come in more than one category. Some facts are simple (what day was Hitler born?), some are not. To navigate this, what we need is not one Wikipedia but a multitude of interoperable knowledge systems – structures that can accommodate perspective, stakeholder interest, and nuance without collapsing into relativism. The technology for this exists; the will to build it is the question.

In a recent LinkedIn post I left the following comment on the then-bubbling topic of Elon Musk's war on Wikipedia; here I'm mostly digging deeper:

Like so many things, AI is accelerating the fracture of something that was already weakening. Wikipedia in many ways IS a crown jewel of human achievement, but its governance process was never fully perfected. It's highly vulnerable to very online vigilante groups that take up certain perspectives and are able to doggedly enforce heterodox views on certain topics. These problems are openly discussed in Wikipedia online communities. 
Fundamentally the internet itself has matured to a point where we need new models and structures to accommodate a definition of truth that goes beyond binary fact and counterfact; a system that can account for perspective, stakeholders, nuance. There is no one view that has a monopoly on truth. But our information models have not been updated to account for this. It aligns with web3 -- decentralization, distribution. 
Tldr Wikipedia is great for simple things and not structured to represent complex ones.

Let’s talk about acupuncture

I may at some point devote an entire essay to this topic, because it was one of my wildest rodeos on the internet (which, if you know my history, is saying something), but I want to give a high-level summary of the situation here, because it was my entry into the vast crevasse that is the topic of Wikipedia's credibility.

The shortest summary is that I am presently banned from contributing to Wikipedia’s acupuncture article. Because of Wikipedia’s transparency, if you want to read about the whole bizarre and sordid process, you can do so here. My full interaction on acupuncture is here.

I was banned because a senior editor camping the acupuncture page reported me for trolling when I left comments on the talk page about adding reliable sources. That editor in turn found himself sanctioned across multiple topics for his behavior toward me, but the arbitrators then decided that it wouldn't be fair if only one of us got banned, so we were both banned. (The symmetry of this outcome might seem fair in the abstract, but it misses the point: the article remains unchanged. The process nominally worked – sanctions were issued, procedures followed – yet the disinformation persists. This is the deeper problem: Wikipedia's dispute resolution system can adjudicate interpersonal conflicts while leaving the underlying content untouched.) I received side-channel messages from admins about how lucky I was to have randomly rolled arbiters who would not reflexively take the side of the senior editor on principle.

The whole thing started at a dinner conversation with my family in which my son asked “what's acupuncture?” and I asked our Google Assistant, because I am constantly testing these things and I was curious what it would say. The assistant gave the top paragraph of the Wikipedia acupuncture article, which calls acupuncture a “pseudoscience” and “quackery”. I was livid. I had absolutely no idea of the long history behind the acupuncture article; I've now gone deep enough into Wikipedia culture that I'm on the Discord, where there are archives of people casually mentioning how untouchable the acupuncture article is because of the cabal of pseudoskeptic editors that camp it. There are about half a dozen usual suspects who regularly engage in bullying on the talk page (the one who came after me has been mentioned in Wikipedia misinformation blog posts before).

The episode illustrates the deep problems with mass-scale volunteer crowdsourced information – and the stakes are high, because Google treats Wikipedia as its most reliable source. When Wikipedia gets something wrong, the error propagates across the information ecosystem.

I recommend you do not do this, but if you dig into the details of the acupuncture article's sourcing, you will find a kind of cult around a blogger named David Henry Gorski, who has built a following by lambasting other scientists. Despite lacking any credential in the field, Gorski is cited five times in the acupuncture article. If you ask why sources such as the WHO, NIH, ACP, ACR, CMS, Cochrane, Johns Hopkins, Harvard, and Scientific Reports – all of which are used as MEDRS-qualified reliable sources on other medical articles and have published positive studies on the application of acupuncture – are unacceptable, you will receive a dizzying conspiracy theory about the capture of those organizations by the “acupuncture lobby”, which as far as I can find does not exist. They also get very, very angry if you ask why there are no Chinese editors working on the page. (I was told that I was the one being racist for asking that question, if you need a measure of where things stand with regard to Wikipedia's progress on cultural bias. But also you're not allowed to say that things are racist.)

A collection of conspiracy theorists who happen to be radicalized under a scientific-skeptic identity controlling pages like acupuncture has real consequences when we treat Wikipedia as a source of truth. This is personal for me – which is problematic for my editing Wikipedia, and points to another flaw in its policies (stakeholders are specifically excluded) – because I have experienced profound health consequences from struggling with chronic pain for many years. I resisted acupuncture because I understood science to be opposed to it, and when I finally undertook treatment this year as a kind of last resort, I found the pain relief it yields superior to anything else I have tried so far. I happen to have the specific constellation of autoimmune and neurological conditions that 1) western science is far behind on, and 2) acupuncture has now specifically been shown to help with. And, spoiler alert, these conditions affect women over men at a ratio of 5:1, running us straight into western science's now-famous, Wikipedia-parallel bias against treating conditions that impact women – and especially against treating women's pain.

But let's set aside the years of pain I endured – the decline in function, the days I missed playing with my kids – and might have avoided had Wikipedia not been peddling disinformation. The specific topic of acupuncture is more dangerous than this. The reason western science is now coming around on acupuncture is that it has been shown to reduce the use of opiates. It also reduces methadone cravings, to the extent that it is now used specifically in opiate addiction treatment. In a country where 54,743 people died of opiate overdose in 2024 (and in Europe, home to many English Wikipedia editors, where more modest opioid death counts are also rising), broader acceptance of acupuncture as a pain treatment could save lives.

On this very specific topic, foundation models are more accurate than Wikipedia (I will die on this hill), which is similarly fascinating to me. In this case, the language model can provide a lengthy, nuanced answer that a sound-bite fact cannot. Please do try this yourself: ask any major chatbot “is acupuncture a pseudoscience?” and tell me whether you think the bot's answer is more accurate than Wikipedia's lede.

In the course of investigating this, numerous Wikipedia administrators told me that to fix the article I would literally have to spend years building political credibility on the site in order to lead a campaign that could alter it. I would rather one of the good folks at Wired take this on, because I think the effort required to correct something that those same administrators agree is wrong is the bigger story here. I also prefer this example because the topic of acupuncture sits uncomfortably if analyzed for left or right bias, and illustrates why approaching Wikipedia's bias from a political angle is deeply counterproductive.

The half dozen editors brigading the acupuncture page seem to see themselves as embattled soldiers defending an Alamo. For them, the pages and pages of argument they endure regularly (the acupuncture talk page has an astronomical word count) are a defense of science itself against the storm of ignorance. As science moves under their feet, they cling to fringe skeptic sources and enforce their perspective by winning the engagement metric that fundamentally powers Wikipedia's governance. As a chronically ill parent, I am excluded from this process by the harm it would do to me and my family to devote my limited energy to grinding through Wikipedia's bureaucracy. A cursory search will show that subject matter experts are frequently excluded from contributing to Wikipedia for similar reasons.

The painful problem is that the attention economy itself threatens to make the science community brittle. When people who clothe themselves in science behave in a defensive, cruel way, it undermines science and accelerates loss of trust in it. And in a governance sense, when a small group of people are able to exploit a system to break its fundamental contract – in this case, Wikipedia's reliability – it inevitably snaps public trust. Acupuncture is one example, but it's a symptom of a deep flaw in Wikipedia that has gone unaddressed for decades, and long precedes AI.

Is Grokipedia a really bad idea?

Yes it is. This is not really even worth spending my attention or yours on. But Grokipedia is the inevitable result of continuing to let these problems go unsolved.

What now?

We cannot treat Wikipedia's ‘bias' problem as a singular issue. When we reduce it to “Wikipedia is biased” (the conservative view) and “Wikipedia is fine” (the liberal view), we create an endless fight that shouts past stakeholders and convinces no one. Wikipedia has dozens or hundreds of articles on contested topics where small groups of entrenched editors have calcified content around positions that diverge from broader expert consensus or reasonable interpretation. Each of these articles has its own community of frustrated people who know from direct experience that Wikipedia is wrong on this specific thing.

  • Acupuncture practitioners and chronic pain patients who know the article misrepresents current research and the treatment's functional medical utility
  • Historians who know a particular article enshrines an outdated interpretation
  • Scientists whose field has moved past what Wikipedia's sources reflect
  • People from non-Western cultures who see their traditions systematically mischaracterized
  • Subject matter experts in countless fields who've tried to correct errors and been rebuffed

Each community's complaint is specific and legitimate. But Wikipedia's defenders dismiss them as "people who don't like what the sources say" or "POV pushers," because that's often true of other complainants. And each community, feeling dismissed, becomes more convinced that Wikipedia is fundamentally broken. Visit these communities online and see what I’m talking about. If you love Wikipedia, it will break your heart.

But it’s worse, and more dangerous, than that. Now aggregate all these communities. That's a lot of people with direct, personal experience of Wikipedia being wrong and being unable to fix it. When a Congressional investigation says "Wikipedia is biased and unaccountable," these people don't hear partisan overreach. They hear someone finally listening.

The tragedy is that the Congressional investigation isn't actually about their concerns. The acupuncture article isn't going to get fixed because James Comer sends a letter. The investigation is about political control, not editorial accuracy. But it sounds like it's about the thing they've experienced, so they're sympathetic to it… or at least not motivated to defend Wikipedia against it.

Something better

The higher quality of foundation-scale LLM answers about acupuncture should act as a provocation to correct broken human systems. Properly used, AI is a tool for insight: a mirror. But fixing human systems is delicate, difficult work. Ultimately, the acupuncture lede is evidence of a small group of people exploiting complex bugs in Wikipedia's bureaucratic code.

I don't think Wikipedia is ignoring these weaknesses – but I do think the community focuses on information problems without understanding that the information problems arise from governance problems. And when defenders of Wikipedia turn the debate into “science good, internet bad”, they mask these problems, to Wikipedia's increasingly existential detriment.

What challenges Wikipedia most is that the scale of the game has radically changed. Today English Wikipedia receives more than 4,000 page views every second and averages over 2 edits per second, with 130 billion page views in 2024. Yet the volunteer workforce has collapsed under this demand: active editors peaked in 2007 at more than 51,000 and by 2013 had fallen by a third, while active administrators dropped from around 1,000 in 2007 to approximately 450 as of August 2025. The median administrator as of March 2024 was promoted in November 2005 – most were appointed nearly two decades ago.
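(For the skeptical, the per-second figure follows directly from the annual one. Here is a quick back-of-the-envelope check in Python, assuming only the 130 billion figure cited above:)

```python
# Sanity check: 130 billion page views in 2024 ≈ how many per second?
annual_page_views = 130_000_000_000   # 2024 total, per the paragraph above
seconds_in_2024 = 366 * 24 * 60 * 60  # 2024 was a leap year

print(f"{annual_page_views / seconds_in_2024:,.0f} page views per second")
# -> 4,111 – comfortably "more than 4,000 every second"
```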

This shrinking corps of volunteers has increasingly relied on automated tools to manage the workload – tools that also make it even more difficult for newcomers to onramp. Retention rates for productive new editors crashed from 40% to 13% between 2007 and 2008. The result is a vicious cycle: overworked administrators and entrenched editors become dependent on each other, unable to enforce neutrality standards against the contributors whose vigilance keeps the site running. Articles on contested topics calcify around the preferences of their most persistent defenders rather than reflecting genuine scholarly consensus.

When I learned all this, the behavior of the admins in ArbCom made more sense to me, as did the intractability of fixing the acupuncture article. At this point administrators are held hostage by engaged, experienced editors, whose vigilance has become more valuable than their adherence to the site's own content standards. Put simply, there are just too many other fish to fry. As it happens, the same editor who got sanctioned in my ArbCom review went on to violate his sanction four months later on the same article. Per Wikipedia policy, having observed this, I should report it; but not only do I vehemently not desire further karmic entanglement with this person, I actually feel bad calling cops who are so obviously overwhelmed.

The macro problem is that these dynamics erode Wikipedia's trustworthiness while simultaneously driving away new editors. The gap between Wikipedia's stated ideals and its operational reality is immediately visible to any newcomer who tries to participate in good faith. The rational response is to leave, which is precisely what the data shows is happening – and the people who stay are disproportionately those willing to fight, further entrenching the adversarial culture that drives everyone else away.

These are familiar dynamics for designers of massively multiplayer online games, or community designers for online populations of any significant size. They are difficult, but not impossible, to fix. Graduated onboarding, reputation systems that decay over time, distributed moderation, structured mentorship – these are solved problems in community design, even at Wikipedia's scale. The dysfunction is a choice, not an inevitability.
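To make one of those mechanisms concrete, here is a minimal sketch of time-decaying reputation – hypothetical parameters and scoring, not Wikipedia's actual system or a concrete proposal – in which influence earned long ago fades unless renewed by recent constructive activity:

```python
import math
import time

# Hypothetical parameter: reputation halves after ~6 months without new activity.
HALF_LIFE_DAYS = 180

def decayed_reputation(events, now=None):
    """Sum contribution points, exponentially discounted by age.

    events: iterable of (timestamp_seconds, points) pairs, one per
    constructive action (a sourced edit, a resolved dispute, etc.).
    """
    now = time.time() if now is None else now
    rate = math.log(2) / (HALF_LIFE_DAYS * 86400)  # per-second decay constant
    return sum(points * math.exp(-rate * (now - ts)) for ts, points in events)

# A veteran's old clout fades; a newcomer's recent work counts nearly in full.
now = time.time()
veteran = [(now - 5 * 365 * 86400, 100.0)]   # 100 points earned five years ago
newcomer = [(now - 7 * 86400, 20.0)]         # 20 points earned last week
print(round(decayed_reputation(veteran, now), 2))   # ≈ 0.09
print(round(decayed_reputation(newcomer, now), 2))  # ≈ 19.47
```

The design point is that seniority alone stops being a trump card: a veteran must keep doing constructive work to retain outsized influence, which directly targets the entrenchment dynamic described above.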

There is a core of Wikipedia editors who are genuinely the best of people: empathetic and sharp-minded lovers of knowledge and wisdom who give enormously of themselves for a project with no prospect of financial gain or fame. But they are under tremendous pressure. I also believe that organizations become manifestations of their leadership. When Wales storms out of an interview over a question he claims not to care about, it's hard not to see a leader overwhelmed by pressure from all sides – and an institution that may lack the capacity to navigate what's coming.

It seems likely to me that our current internet requires an entirely new approach to Wikipedia. It requires a multitude of interoperable Wikipedias that can be created effortlessly enough to be accessible to people of diverse backgrounds. It requires new tools for governance, and public support to pay for the maintenance of this infrastructure. It is the project of several lifetimes. It is worth it.

Friends know that I bring up Ruthanna Emrys’s A Half-Built Garden with obnoxious frequency. I tend to agree with Ruthanna’s vision in that book that AI is more likely to be a part of this new future of information than not. Wielded well, it can empower human creators in a way that achieves information equity. Not by providing us solutions, but by enabling us to connect with each other more effectively, and to spend our time with the greatest impact.

Our current moment feels dark because old systems are failing and new ones haven’t been built yet. Wikipedia's dysfunction is evidence that we've outgrown the first-generation model of collaborative knowledge production – but the failure also creates an opening. We now know what doesn't scale: centralized governance, unpaid moderation at planetary scope, bureaucratic systems that reward persistence over accuracy. The question is whether we can learn from what went wrong and build something better. The alternative is to retreat into walled gardens and algorithmic feeds that optimize for engagement over truth. The golden age of information isn't behind us. It hasn't happened yet.
