Jimmy Wales Doesn’t Think Grokipedia Will Be ‘Anything Like Wikipedia’

Jimmy Wales calls himself a “pathological optimist.” And yet, when the Wikipedia co-founder spoke with TIME in October, he still seemed somewhat surprised that his online encyclopedia actually works. “Wikipedia is very trusting, and it has always seemed a little crazy,” Wales says. When you consider the chaos of social media, he adds, Wikipedia’s model of letting anyone edit any entry seems “completely crazy.”

We’re talking about this because Wales has just written his first book, *The Seven Rules of Trust*, which attempts to lay out what Wikipedia and a few other bright corners of the internet (Wales cites Airbnb, Uber, and eBay) can teach us about rebuilding trust in a world steeped in skepticism. Since Wikipedia’s launch in 2001, trust in politicians, the mainstream media, and “to some extent each other” has fallen sharply, Wales says, with consequences extending well beyond political gridlock. Wales, 59, was friends with Jo Cox, a member of the British Labour Party who was murdered in 2016 by a far-right extremist in the days before the Brexit referendum. He believes the rise in politically motivated violence is “the natural result of a sense of complete breakdown of social norms and the idea of trust—of being able to say, ‘Look, I don’t agree with you, but I believe we can have a dialogue, find compromise, and move forward,’” he says. And yet, “Wikipedia has gone from being a joke to one of the few things people trust.”

Recently, however, this crisis of trust has begun to nip at Wikipedia’s heels. Billionaire Elon Musk, once a big fan of the site, has attacked the encyclopedia, as have White House AI and crypto czar David Sacks, conservative commentator Tucker Carlson, and even Wales’s Wikipedia co-founder Larry Sanger, all of whom argue that Wikipedia is biased.

In October, the day before Wales published his book, Musk released a Wikipedia competitor called Grokipedia, which he said uses his company xAI’s chatbot Grok to generate entries. The AI-driven encyclopedia currently has over 885,000 articles, many of which closely resemble their Wikipedia counterparts. Although Grokipedia is dwarfed by Wikipedia, which has more than 7 million English-language articles, Musk said in a post on his social media platform X that Grokipedia will surpass Wikipedia “by orders of magnitude” in breadth, depth, and accuracy. Musk has been criticizing Wikipedia for some time, deriding it as “Wokepedia,” and in 2023 offered to give the platform, which is overseen by the nonprofit Wikimedia Foundation, $1 billion if he could rename it “Dickipedia.” Wales told Bloomberg in October that Musk’s accusations of bias were “not true,” adding: “A better way to say it is: if you feel Wikipedia has some bias, encourage people to come and participate—people who agree with you. Don’t paint us as… crazy leftist activists or anything like that. That’s just wrong.”

Early responses to Grokipedia split along familiar lines. Musk’s fans praised Grokipedia for having “no human bias and no mistakes” and for the “nuances and details” in entries on topics such as the death of George Floyd: the Grokipedia article highlights Floyd’s criminal record early on, mentioning his killing by police only later. Critics, meanwhile, point out that articles about Musk and his companies are longer than their Wikipedia counterparts but omit unflattering details. Unlike Wikipedia, Grokipedia cannot be edited by users directly. Users can review sources and suggest corrections, but those suggestions are not debated on public talk pages or resolved by volunteer moderators, as they are on Wikipedia. Instead, they are processed by Grok, a version of the same AI chatbot that made antisemitic statements after a July update, forcing xAI to apologize and roll the update back. Wales’s reaction to all this? “I don’t think we’ll see fragmentation of online encyclopedias any time soon. Wikipedia will continue to strive for high quality and neutrality,” he says. “If Elon creates an encyclopedia skewed by his worldview, I’m sure it will get some traffic, but it won’t be anything like Wikipedia.”

Wales seems keenly aware of Wikipedia’s shortcomings. His book revisits infamous episodes, such as when an online troll used the site to falsely implicate journalist John Seigenthaler in the Kennedy assassinations. Wales writes that governments, activists, and ideologues have all tried to use the platform’s editing tools to promote their worldviews. But the site’s continued growth suggests those interests have not overpowered the volunteer army of “Wikipedians,” he says. “The fact that Wikipedia is still huge and more popular than any newspaper is partly because we try very hard—not perfectly, of course—to stick to the facts and be transparent,” Wales says. “You can see where the information came from. You can click on it and check it.”

Wales himself became embroiled in an editing dispute on November 2 over the site’s “Gaza genocide” article, writing on its talk page that the article “does not meet our high standards” in using Wikipedia’s own voice to say that Israel is committing genocide in Gaza. He called it a “particularly egregious example” of broader neutrality problems on the site. Wales’s comments sparked backlash from some editors. “Why should the opinions of largely impartial UN experts and human-rights scholars be given equal weight with the clearly biased opinions of commentators and governments?” one editor asked. “Because that’s what neutrality requires,” Wales replied. “Our job as Wikipedians is not to take sides in this debate, but to document it carefully and neutrally.” (The Wikimedia Foundation said in a statement that, even as a co-founder, Wales is only “one of hundreds of thousands of editors who strive to report information, including on controversial topics, in accordance with Wikipedia’s policies.”)

Grokipedia is not the only threat that artificial intelligence poses to Wikipedia. About 65% of the traffic hitting the nonprofit’s servers now comes from bots, some of which scrape the site to supply training data for chatbots. Instead of clicking through to Wikipedia, search-engine users can now often find answers in AI-generated summaries, which are sometimes flawed—that is, unless they go straight to ChatGPT or Claude. Wales says all this makes islands of human-created content such as Wikipedia “more important than ever.” He argues his principles of trust are just as relevant for AI developers, “because every time you get an AI response and find out that the AI was hallucinating and just made it up, it lowers your trust.”

This is where the “real world” comes in. Part of Wales’s message is that most of us already practice trust in “very routine ways,” such as sharing a ride or an elevator with strangers. He points to Braver Angels, an American group that facilitates face-to-face conversations between people with opposing political views. Participants often come away “a little more understanding… a little more willing to think about trade-offs,” Wales says. The challenge is to build institutions and online spaces that harness those impulses. Wikipedia’s collaborative culture is, at its best, the web’s version of that: slow, structured, and imperfect.

And when it comes to interacting online, Wales’s best advice is disarmingly simple. Focus your attention on actions that build trust. Audit your feeds. “If you find yourself spending too much time on social media and getting information you don’t trust, stop doing it,” he says. He offers one specific tip: delete X from your phone.