At the start of this century, the Internet underwent a profound transformation. The so-called Web 2.0 marked the shift from passive reading – static pages, closed portals and short-lived content – to an ecosystem based on participation, collaboration and exchange. Platforms instead of pages, users instead of audiences. In that context, on January 15, 2001, a website appeared whose first sentence announced, quite simply: “This is the new Wikipedia!”
Twenty-five years later, Wikipedia has become one of the most influential social experiments in digital history. Not because it is perfect, but precisely because it lays bare how it works, its conflicts and its limitations. Its imperfection is the point. Behind each seemingly neutral entry lies an entire choreography of human decisions: volunteers evaluating sources, debating disagreements, correcting ambiguities, and enforcing standards such as verifiability and neutrality. The encyclopedia does not present itself as revealed truth, but as an open, collaborative process: an agreement that is never finished.
The classic episode of Jimmy Wales attempting to write an entry about a restaurant in Gugulethu, South Africa, illustrates this model well. His edits were reviewed by other editors who questioned the notability of the business. The end result was not an imposition of the founder’s authority, but a deletion debate left on the public record. On Wikipedia, even the exercise of power, including that of its founder, is subject to public debate.
This transparency is one of its greatest strengths. Reading an article’s revision history can be more informative than the final text. It is no accident that academics have used these histories as research material to analyze scientific controversies – such as the development of CRISPR gene editing – or complex political processes such as the Egyptian revolution of 2011. Wikipedia does not just record knowledge: it documents how that knowledge is built.
The scale of this human effort is easy to underestimate. In 2024, according to Statista, 4.4 billion people visited the site – more than half of the world’s population. More than 125,000 people edited at least one article each month. All this without advertising, without selling personal data, and with a foundation – Wikimedia – sustained by donations and by some paid services for the massive reuse of its content, including by those who train artificial intelligence systems.
On an Internet dominated by platforms that monetize participation through audience attention, Wikipedia looks like a functioning remnant of Web 2.0’s original ideals. That model, however, now faces a challenge of a different kind: the accelerated expansion of generative artificial intelligence.
The comparison with Grokipedia is inevitable: an AI-generated encyclopedia associated with Elon Musk and launched at the end of 2025. Within a few months it accumulated more than 5.6 million entries, rivaling the volume of Wikipedia. Many of them are near-identical copies of the originals, something the open licenses permit, though not without controversy. Unlike Wikipedia, these articles cannot be edited directly: users can only suggest corrections for the AI to “consider”.
The question is unsettling: if a machine-generated encyclopedia can match or surpass the scale of one built by volunteers over decades, are we facing the end of the collaborative web, or just another adaptation?
The relationship between artificial intelligence and the human-made Internet has always been ambiguous. Knowledge shared voluntarily becomes raw material for models trained without consent and with limited attribution. In turn, those models pour text back onto the web without reliable sources: what many editors call “AI slop”.
Wikipedia is no stranger to this problem. Volunteers increasingly encounter plausible but false citations, generic boilerplate phrasing and nonexistent links generated by AI. The response has been to strengthen community oversight, with initiatives such as WikiProject AI Cleanup offering guidance for identifying and cleaning up this type of content. Rather than banning AI outright, Jimmy Wales advocates possible synergies: tools that help, for example, to include non-English-speaking editors or to reduce the persistent gender imbalance among contributors.
Wikipedia’s credibility, however, has always been under scrutiny. Cases such as the hoax biography of John Seigenthaler Sr. or the scandal of the editor Essjay and his fabricated credentials exposed its vulnerability. Paid conflicts of interest have also come to light, such as the Wiki-PR case of 2012. These episodes fuel recurring accusations of ideological bias, amplified today by figures like Musk, who present Grokipedia as a necessary “purge” of what they consider dominant propaganda.
After 25 years, Wikipedia seems to be facing a new version of the “tragedy of the commons”: knowledge contributed by volunteers now sustains systems that can, in some cases, degrade the quality of the information ecosystem. The key difference lies in transparency. Wikipedia offers public histories, visible debates and shared rules. Generative AI, by contrast, operates mostly through opaque models that imitate knowledge without showing the process behind it. If search engines begin to favor these automatic answers, information habits could change quickly.
Wikipedia and AI are reshaping the flow of knowledge, each with its own logic. The first relies on documented human deliberation; the second, on the mass production of plausible-sounding text. For now, the choices remain collective. And perhaps that is Wikipedia’s most important lesson.
