
The Culture Wars Came for Wikipedia. Jimmy Wales Is Staying the Course.

October 18, 2025

As one of the most popular websites in the world, Wikipedia helps define our common understanding of just about everything. It’s the closest thing the internet has to a public utility. Founded in 2001 by Larry Sanger and Jimmy Wales, Wikipedia has always operated as a nonprofit with a decentralized system of editing by mostly anonymous volunteers. There are rules for how people should engage on the site (cordially) and how changes are made (transparently). That has led some to call Wikipedia “the last best place on the internet.”

But recently, the site has become a favorite target of Elon Musk, congressional Republicans and right-wing influencers, who all claim that Wikipedia is biased. (Sanger now says the same.) In many ways, this tension is a microcosm of broader conversations we’re having about consensus, civility, shared reality, truth and facts.

Now Jimmy Wales has written a book titled “The Seven Rules of Trust: A Blueprint for Building Things That Last,” which will be published this month. In it, Wales tries to apply the lessons of Wikipedia’s success to our increasingly partisan, trust-depleted world. I talked to Wales about what, in his view, makes Wikipedia great, the various threats it’s facing and his persistent belief that most people are acting in good faith. And a note: Wales and I spoke several weeks before a man with a gun walked up on the stage at a Wikipedia conference in Manhattan on Friday. You can read more about that incident here.


This is a very tenuous moment for trust, and your new book is all about that. Big picture: How would you describe our current trust deficit? I draw a distinction between what’s going on with politics, journalism, the culture wars and all of that, and day-to-day life. In day-to-day life, people still do trust each other. People generally think most people are basically nice and we’re all human beings bumping along on the planet trying to do our best. But the crisis we see in politics — trust in politicians, trust in journalism, trust in business — is coming from other places and is something that we can fix.

One reason you can be an authority on this is that you created something that scores very high on trust. Wikipedia isn’t as good as I want it to be. And that’s part of why people do have a certain amount of trust for us, because we try to be really transparent. You see the notice at the top of a page sometimes that says, “The neutrality of this page has been disputed,” or “The following section doesn’t cite any sources.” People like that. Not many places these days will tell you, Hey, we’re not so sure here.

Wikipedia is famously open-source. It’s decentralized and essentially run by thousands of volunteer editors. You don’t run Wikipedia. It runs me. [Laughs]

How do those editors fix disputes when they don’t agree? Take a controversial issue like abortion. We can report on the dispute. So, rather than trying to say abortion is a sin or abortion is a human right, you could say, “The Catholic Church position is this, and critics have responded thusly.” I believe that that’s what a reader really wants. They don’t want one side of the story. They want to understand what people are arguing about. They want to understand both sides.

Every page has what’s called a talk tab, where you can see the history of the discussions and the disputes, which relates to another principle of the site: transparency. Yeah, exactly. Often, you’ll be able to go on the talk page and read what the debate was, and you can weigh in and say: “Oh, actually, I still think you’ve got it wrong. Here are some more sources, here’s some more information.” Maybe propose a compromise. And in my experience, it turns out that a lot of pretty ideological people on either side are actually more comfortable doing that because they feel confident in their beliefs. I think it’s the people — you’ll find lots of them on Twitter [X], for example — who are not that confident in their own values and their own belief system, who feel fear or panic or anger if someone’s disagreeing with them, rather than saying: “Huh, that’s different from what I think. Let me explain my position.”

One of Wikipedia’s superpowers can also be a vulnerability. Human editors can be threatened, even though they’re supposed to be anonymous. You’ve had editors doxxed, pressured by governments to doctor information. Some have had to flee their home countries. I’m thinking of what’s happened in Russia, India, where those governments have really taken aim at Wikipedia. Would you say this is an expanding problem? Yeah, I would. We are seeing all around the world a rise of authoritarian impulses toward censorship, toward controlling information, and very often these come as a wolf in sheep’s clothing, because it’s all about “protecting the children” or whatever it might be. But at the same time, Wikipedians are very resilient and very brave. In many cases, what’s happened is a real lack of understanding by politicians and leaders of how Wikipedia works. A lot of people really have a very odd assumption that it’s somehow controlled by the Wikimedia Foundation, which is the charity that I set up that owns and operates the website. And therefore they think it’s possible to pressure us. The community has real intellectual independence. But yeah, I do worry about it. Something that weighs very heavily on us is volunteers who are in dangerous circumstances, and how they remain safe is critically important.

I want to bring up something that just happened here in the United States. In August, James Comer and Nancy Mace, two Republican representatives from the House oversight committee, wrote a letter to Wikimedia requesting records, communication, analysis on specific editors and any reviews on bias regarding the state of Israel, in particular. They are, and I’m going to quote here, “investigating the efforts of foreign operations and individuals at academic institutions subsidized by U.S. taxpayer dollars to influence U.S. public opinion.” Can you tell me your reaction to that query? We’ve given a response to the parts of it that were reasonable. We feel like there’s a deep misunderstanding or lack of understanding about how Wikipedia works. And ultimately, the idea that something being biased is a proper and fit subject for a congressional investigation is frankly absurd. In terms of asking questions about cloak-and-dagger whatever, we’re not going to have anything useful to tell them. I know the Wikipedians. They’re a bunch of nice geeks.

The Heritage Foundation, the architect of Project 2025, has said that it wants to dox your editors. How do you protect people from that? It’s embarrassing for the Heritage Foundation. I remember when they were intellectually respectable.

But it does seem as if there is this movement on the right to target Wikipedia, and I’m wondering why you think that’s happening. It’s hard to say. Some of it would be genuine concern, if they see that maybe Wikipedia is biased. For example, Elon Musk has said Wikipedia is biased because of really strong rules about only citing mainstream media, and the mainstream media is biased. OK, that’s an interesting criticism worthy of some reflection by everyone, the media and so on. Then, in various places around the world, not speaking just of the U.S., facts are threatening. And if you and your policies are at odds with the facts, then you may find it very uncomfortable for people to simply explain the facts. But we’re not about to say: “Gee, you know, maybe science isn’t valid after all. Maybe the Covid vaccine killed half the population.” No, it didn’t. That’s crazy, and we’re not going to print that. They’re going to have to get over it.

I want to talk about a recent example of a controversy surrounding Wikipedia, and that’s the assassination of Charlie Kirk. Senator Mike Lee called Wikipedia “wicked” because of the way it had described Kirk on its page as a far-right conspiracy theorist. I went to look, and at the time that we’re speaking, that description is now gone. Those on the left would say that description was accurate. Those on the right would say it was biased. How do you see that tension? The correct answer is you have to address all of that. The least controversial thing you could say about Charlie Kirk is that he was controversial. I don’t think anybody would dispute that. And to say, OK, this is a figure who was a great hero to many people and treated as a demon by others. He had these views, many of which are out of step with, say, mainstream scientific thinking, many of which are very much in step with religious thinking — those are the kinds of things that, if we do our job well, which I think we have in this case, we’re going to describe all of that. Maybe you don’t know anything about Charlie Kirk, you just heard: Oh, my god, this man was assassinated. Who was he? What’s this all about? Well, you should come and learn. You should learn who his supporters were and why they supported him and what are the arguments he put forward and what are the things he said that upset people. That’s just part of learning what the world is about.

So those words that were there — “far right” and “conspiracy theorist” — those were, in your view, the wrong words, and the critics of Wikipedia had a point? Well, it depends on the specific criticism. If the criticism is that this word appeared on this page for 17 minutes — you’ve got to understand how Wikipedia works. It’s a process, it’s a discourse, it’s a dialogue. But to the extent that he was called a conspiracy theorist by prominent people, that’s part of his history. Wikipedia shouldn’t necessarily call him that, but we should definitely document all of that.

You mentioned Elon Musk, who has come after Wikipedia. He calls it Wokepedia. He’s now trying to start his own version of Wikipedia called Grokipedia. And he says it’s going to strip out ideological bias. I wonder what you think attacks like his do for people’s trust in your platform. Because as we’ve seen in the journalism space, if enough outside actors are telling people not to trust something, they won’t. It’s very hard to say. For many people, their level of trust in Elon Musk is extremely low because he says wild things all the time. When he attacks us, people donate more money. That’s not my favorite way of raising money, but the truth is, a lot of people are responding very negatively to that behavior. One of the things I do say in the book, and I’ve said to Elon Musk, is that type of attack is counterproductive even if you agree with Elon Musk. Because to the extent that he has convinced people falsely that Wikipedia has been taken over by “woke” activists, then two things happen: Your kind and thoughtful conservatives — who we very much welcome and we want more of — if those people think, “Oh, no, it’s just going to be a bunch of crazy woke activists,” they’re going to go away. And then on the other side, the crazy woke activist is going to be like: “Great, I found my home. I can come and write rants against the things I hate in the world.” We don’t really want them, either.

You said you talked to Elon Musk about this. When did you talk to him about it, and what was that conversation like? We’ve had various conversations over the years. He texts me sometimes. I text him sometimes. He’s much more respectful and quiet in private. But that you would expect. He’s got a big public persona.

When was the last time you had that exchange? That’s a good question. I don’t know. I think the morning after the last election. He texted me that morning. I congratulated him.

The debate that happened more recently was because of the hand gesture he made that was interpreted in different ways, and he was upset about how it had been characterized on Wikipedia. I heard from him after that. In that case, I pushed back because I went to check what Wikipedia said. And it was very matter-of-fact. It said he made this gesture, it got a lot of news coverage, many interpreted it as this and he denied that it was a Nazi salute. I don’t see how you could be upset about it being presented in that way. If Wikipedia said, “Elon Musk is a Nazi,” that would be really, really wrong. But to say, “Look, he did this gesture and it created a lot of attention, and some people said it looked like a Nazi salute?” That’s great. That’s what Wikipedia should do.

Do you think Elon Musk is acting in good faith? You’re saying that in private he’s nice and cordial, but his public persona is very different. I think it’s a fool’s errand to try and figure out what’s going on in Elon Musk’s mind, so I’m not going to try.

I don’t mean to press you on this, but I’m just trying to refer to something that you said, which is that people, human to human, are nice, that we should assume good faith. And so you’re saying that Elon, one on one, is lovely. But he is attacking your institution and potentially draining support for Wikipedia. I don’t think he has the power he thinks he has, or that a lot of people think he has, to damage Wikipedia. We’ll be here in a hundred years and he won’t. As long as we stay Wikipedia, people will still love us. All the noise in the world and all these people ranting, that’s not the real thing. The real thing is genuine human knowledge, genuine discourse, genuinely grappling with the difficult issues of our day. That’s actually super-valuable. So I hope Elon will take another look and change his mind. That’d be great. And in the meantime, I don’t think we need to obsess over it.

Why do you think the internet didn’t go the way of Wikipedia — collegial, working for the greater good, fun, nerdy? I’m old enough that I grew up on the internet in the age of Usenet, which was this massive message board — kind of like Reddit today, except not controlled by anyone, because it was by design distributed and uncontrollable, un-moderateable for the most part. And it was notoriously toxic. So I think some of these things are just human issues. But now we live online, so obviously the impact is much more.

You chose at a certain point to make Wikipedia a nonprofit. You chose not to capitalize on the success of Wikipedia. OpenAI started as an “open source for the greater good” project, kind of like Wikipedia. And they’ve now shifted into being a multibillion-dollar business. I’d love to know your thoughts on that shift for OpenAI, but more broadly, do you think the money really changed the equation? I do think it made a difference in lots of ways. There’s nothing wrong with for-profit companies, but even as a nonprofit, you have to have a business model, you’ve got to figure out how you’re going to pay the bills. For Wikipedia, that’s not too bad. We don’t require billions and billions of dollars in order to operate. In terms of the development of Wikipedia and how we’re so community-driven, you wouldn’t necessarily have that if the board were made up of investors who were worried about the profitability. The most successful tweet I ever had — I think it was a New York Post journalist tweeted to Elon, “You should just buy Wikipedia,” when he was complaining, and I just wrote, “Not for sale.” That was very popular, but it isn’t for sale. I would like to imagine myself as the person who would say to Elon, “No thank you for a $30 billion offer,” if I owned the whole thing. But would I actually? And so that’s not going to happen because we’re a charity and I don’t get paid and the board doesn’t get paid, and I do think that’s important for that independence. We don’t think in those terms. We’re not even interested in that.

The co-founder of Wikipedia, Larry Sanger, gave an interview to Tucker Carlson that’s getting a lot of attention on the right. In the past he’s called Wikipedia “one of the most effective organs of establishment propaganda in history,” and he believes Wikipedia has a liberal bias. In this interview, and on his X feed, he’s advocating what he’s calling reforms to the site, which include “reveal who Wikipedia’s leaders are” and “abolish source blacklists.” I just wonder what you make of it. Haven’t watched it. I can’t bear Tucker Carlson. So I can’t speak to the specifics, but the idea that everything is an equally valid source and that it’s somehow wrong that Wikipedia tries to prioritize the mainstream media and quality newspapers and magazines and make judgments about that is not something I can apologize for. One of my fundamental beliefs is that Wikipedia should always stand ready to accept criticism and change, and so to the extent that a criticism says Wikipedia is biased in a certain way and that these are the flaws in the system, well, we should take that seriously. We should say, “OK, is there a way to improve Wikipedia? Is our mix of editors right?” At the same time, we’re designing everything for the long haul, and the only way we can last that long is not by pandering to this raging mob of the moment but by maintaining our values, maintaining our trustworthiness. We’re just going to do our thing, and we’re going to do it as well as we can. I don’t know what else we can do.

This interview has been edited and condensed from two conversations. Listen to and follow “The Interview” on Apple Podcasts, Spotify, YouTube, iHeartRadio or Amazon Music.

Director of photography (video): Zackary Canepari

Lulu Garcia-Navarro is a writer and co-host of The Interview, a series focused on interviewing the world’s most fascinating people.

The post The Culture Wars Came for Wikipedia. Jimmy Wales Is Staying the Course. appeared first on New York Times.
