DNYUZ

Everything Wrong With the Internet and How to Fix It

February 6, 2026

This is an edited transcript of “The Ezra Klein Show.” You can listen to the episode wherever you get your podcasts.

What was the last year the internet felt good to you? I think everybody has different answers for this. I go fairly far back — maybe to the heyday of blogging, or at least before the moment when Twitter and Facebook went algorithmic.

Whatever your answer, I have not found many people who think the internet of 2026 — with all of its anger and its outrage and its A.I. slop — is what we were initially promised. There is among us a growing unease that something went wrong with the internet, that it is driving our society somewhere we don’t want it to go. Yet there is not really a consensus about what to do about these giant platforms that are increasingly spammed up with ads and sponsored results, boosting content that will keep us hooked and angry, isolating and dividing us, deranging our politics, making a few billionaires ever richer — all the while held up by an army of low-wage workers in warehouses and on delivery bikes.

It’s clear something has gone wrong. But what do we do about it?

My guests today have two theories. Cory Doctorow is a longtime blogger, an activist with the Electronic Frontier Foundation and a science fiction writer. His new book is “Enshittification: Why Everything Suddenly Got Worse and What to Do About It.”

Tim Wu worked as a special assistant to President Joe Biden for technology and competition policy. He’s a professor at Columbia Law School and the author of several influential books on technology, including his latest, “The Age of Extraction: How Tech Platforms Conquered the Economy and Threaten Our Future Prosperity.”

Enshittification and extraction — those are the ideas I wanted to put in play together here, and also to think about what solutions they might present.

Ezra Klein: Tim Wu, Cory Doctorow, welcome to the show.

Cory Doctorow: Thank you very much.

Tim Wu: Good to be here.

Klein: So I just learned that you both went to elementary school together? [Chuckles.]

Wu: Yeah, that’s true.

Doctorow: In suburban Toronto, a weird little school with, like, 80 kids. It was kindergarten to eighth grade in one classroom. Older kids taught the younger kids. We more or less were left to go feral and design our own curriculum. They chucked us out of the school on Wednesday afternoons to take our subway pass and find somewhere fun in Toronto to go do stuff. It was great.

Klein: Is there anything about that school that would lead people to become sworn enemies of our tech overlords?

Wu: Well, we loved tech at the time. We were early in on Apple IIs, and frankly, that’s where it all started in a way. Both of our books have this kind of pining for a lost age, and I think some of it is this early era of computing, when we were just so excited, so optimistic, and everything was just going to be so amazing. That, to me, a little bit, was fifth grade — or Grade 5, as we say — programming the Apple II.

Doctorow: Can I slightly problematize that? We were both also science fiction readers back then. So, you know, in 1981, the first William Gibson story had been out for a couple of years. I was pretty alive to the dystopian possibilities of computers at the time, so I wouldn’t call myself optimistic. I would call myself hopeful and excited, but not purely optimistic.

And I would also like to say, as John Hodgman does, that “Nostalgia is a toxic impulse.”

When I think about what I like about those days, it’s not that I want to recover them, it’s more that I kind of dispute that the only thing an era in which people had lots of control over their computers could have turned into is one in which the computers had lots of control over them. That there is probably something else that we could have done.

Klein: When you’re spending time on the internet, what feels bad about it to you?

Doctorow: What I would do is contrast what happens when things aren’t great now with how I felt about what happened when things weren’t great before. So I think when I was a larva on the early internet, and I saw things that sucked, I would think: Someone is going to fix this, and maybe it could be me.

Now when I see bad things on the internet, I’m like: This is by design, and it cannot be fixed because you would be violating the rules if you even tried.

Klein: Tim, how about you?

Wu: I feel it’s like a tool I cannot trust. I feel like the tools I like in my life, like a hammer — I swing it, and it does something predictable.

The internet seems like it’s serving two masters. I search for something, I get a bunch of stuff I don’t really want, and I don’t really know what I’m getting. I want to write one email or check one thing. I end up in some strange rabbit hole, and three hours go by, and I don’t know what happened.

So I feel like I’m constantly at risk of being manipulated or taken from, and I don’t trust the tools to do what they say they’re going to do. And I feel that makes using it kind of like living in a fun house.

Klein: So I want to make sure I give voice to somebody who is not in the show at the moment, because this is going to have the flavor of ——

Doctorow: The Prophet Elijah has entered the chat.

Klein: Yeah, yeah. The flavor of three middle-aged guys who think the internet went wrong somewhere along the way. When I was working on this episode with my producer, one of the interesting tensions behind the scenes was that she doesn’t think the internet is bad. She said she thinks TikTok is “a perfect platform.” She has young kids and feels Amazon is a godsend for a young parent.

Obviously, there are many people like this who are using these platforms happily and freely of their own volition. So what do you say to somebody who says: What are you [expletive] all talking about?

Wu: I guess I’ll start. There’s the middle-aged “it used to be better” thing, and I don’t want to fall into that trap. I just think the deal is not what it could be. And I think that maybe as a consumer who sort of lightly uses this, the internet is still useful.

I have children, too. And I think it’s hard to deny that social media has been tough on kids and has had all kinds of negative effects on them, and that really started accelerating over the last 15 years or so.

I think we have a highly polarized political structure, which is made worse by social media. I think we have a problem with inequality, which has gotten worse and worse, and is accentuated by the fact that the margins are just so thin for independent businesses.

And I also think about this vision that it would be an equalizer, a leveler — this technology that made a lot of people rich, not just a few; that it was a more or less reasonable, even lucrative, thing to start your own business; that it would sort of ease some of the challenges of inequality and class structure in the United States. That vision has not been realized.

Now maybe those were very high hopes, but this is the key concept in my book, and I think key to understanding the economics of our time. It’s the importance of platforms, which are any space or any institution that brings together buyers and sellers, speakers and listeners.

Every civilization has had platforms. I was in Rome a few weeks ago. And you go to the Roman Forum, and there it is — it’s all there together: the buyers, the sellers. They have the courts. They have places where people gave their speeches. They’re kind of the core of every civilization.

And at some level, I wrote this book because I was interested in this question of what our fundamental platforms look like and how that reflects on the civilization we are building. Because I do think they have a large impact. I think that’s kind of undeniable.

I think that things have gotten worse in many dimensions, and I guess it relates to my view of the state of the country, as well.

We’ve been in a better place in other periods of American history. And I think the internet is not the only cause, but I think it’s part of it.

Doctorow: If I were having this conversation with your producer and we had some time to talk about it, I would probably walk them through a couple of the undisputed ways in which some people have found the internet get worse for them.

So Tim has talked a little about margins for small businesses. There are also people who are performers, who have found that the take the platforms suck out of their pay packet every month keeps going up and up. There are people who would really not like to be snatched by ICE squads, who installed ICE Block on their iPhones, only to have Tim Cook decide that ICE officers were members of a protected class and remove the app — and now you can’t install that app, because the iPhone only lets you install official apps. And I’d say: just because this hasn’t hit you yet, unless you have a theory about why you are favored by these platforms, you should at least be worried that it could come for you.

I would follow up by saying: Let’s not fall into the trap of vulgar Thatcherism. Thatcher’s motto was: “There is no alternative.” And I think tech bosses would like you to believe that, too: that if you’re enjoying having a conversation on Facebook with your friends, which I stipulate, lots of people do — I think that’s absolutely the case, and we should value and celebrate that — that you just have to accept that there is no way to have a conversation with your friends that Mark Zuckerberg isn’t listening in on. And to ask for otherwise would be like asking for water that’s not wet — it’s just not possible.

What I’m militating for is not: You don’t like that thing you like. It’s: I like that you like the thing you like. I want to make it good, and I also want to guard it against getting worse. Because just because it hasn’t happened to you yet, it would be naive to think that it would never come for you.

Klein: Your books are two frameworks for understanding what I would call corporate capture of the internet, the way we went from the dream of a decentralized user-controlled internet to something that a small number of corporations really run and have enormous power over.

Tim, the term you focus on is “extraction.” Cory, the term you focus on is “enshittification.” So I’d like you both to define those terms for me. What is extraction, Tim? What is enshittification, Cory?

Wu: “Extraction” is actually a technical economic term that refers to the ability of any entity or any firm to take wealth or other resources far in excess of the value of the good being provided. And not only the value being provided, but also its cost to provide it. That’s the technical definition.

Outside of technology, you might have a pharmaceutical company. There’s a rare disease, and they have the only treatment for it. Maybe they’re extracting as much as they can — $100,000 a year is about where they usually pin prices for those kinds of diseases. That’s the definition.

And I think the idea of it comes from a sense — something I get from teaching at a business school sometimes — that American business has, in my view, moved increasingly to focus its efforts on finding points of extraction as a business model, as opposed to, say, improving the product or lowering the price: Try to find the pain points where your customers really have no choice. And then take as much as you can. Kind of like a poker game where you go all in because you have a good hand.

Now there’s always been a little bit of that in business, or maybe a lot, like in the Gilded Age. But the question is: What is the ratio? How much of business is providing good services, for good prices?

You know, making a profit, that’s fine. But how much is just that different thing of extraction?

Klein: So Tim, before we move on to Cory, I want to zoom in on something you said there, because a lot of that definition seemed to turn on how you define value.

A lot of economists would say price is a method of discovering value. If you have a pharmaceutical that people are willing to pay $70,000 for, that means they value it at $70,000, even if you think that is extractive.

So how do you know when a price, when a profit, is actually extractive? Versus when we’re seeing that people value that product very highly — and bully on the producer for creating something people value so highly?

Wu: So if someone, for example, has no choice, but they are desperate, let’s say, for water, and someone is able to sell them a bottle of water, because they’re dying, for $100,000 or something like that, then yes, that person does value it at that level.

Every economist would agree with this, that an economy full of nothing but maximized monopoly prices, where people are in a position to extract, is inefficient for two reasons. One is that too much money gets spent on that water versus other things, like maybe pursuing an education. And second, the entity that holds that much power actually has an impulse to reduce supply, reduce output and therefore produce less of the stuff so that they can extract the higher price.

This is just classic monopoly economics. Everyone inside themselves has something they’re willing to pay, but that doesn’t mean it’s a good society when you’re constantly paying the maximum you are willing to pay in every situation.

Klein: For Facebook, for TikTok — we’re not paying for them. So when you say they are extractive, what are they extracting and from whom?

Wu: When you use Facebook, you are constantly being mined for your time, attention and data in a way that is extraordinarily valuable and that yielded something like $87 billion in profit last year.

So things that feel free: Is it free when you suddenly spend hours wandering around random things you didn’t intend to? Is it free when you end up buying stuff that you didn’t really want and wonder why you got it later? Is it free when you feel that you’ve had your vulnerabilities exploited?

I would say none of that is free. You’re poorer both in your own consciousness and in terms of your attention and control over your life, and you’re poorer, probably, in misspent money.

Doctorow: I also wanted to react to something that you were sort of implying, Ezra, which is this idea of revealed preferences, which you often hear in these discussions. If you let Facebook spy on you, no matter what you say about how you feel about Facebook spying on you, you have a revealed preference.

Tim used the word “power” when he responded to that. I think that if you ask the neoclassicals, they’ll say: Well, we like models, and it’s hard to model qualitative aspects like power. So we just leave them out of the model and hope that it’s not an important factor.

This is how you get these incredibly bizarre conclusions, like: If you sell your kidney to make the rent, you have a revealed preference for having one kidney. But what we actually know when we give people choices, when the state intervenes or when there’s countervailing power, is that often you get a different revealed preference.

When Apple gave Facebook users the power to tick a box to opt out of Facebook’s spying, 96 percent of Apple users ticked that box. So the argument that Facebook users don’t mind being spied on, I think, is blown out of the water when you actually give them a way to express preferences. And I assume the other 4 percent were either drunk or Facebook employees — or drunk Facebook employees — which makes sense, because I would be drunk all the time if I worked at Facebook. But I think it’s hard to deny that people really don’t want to be spied on if they can avoid being spied on.

Klein: All right, I think that’s a good setup for enshittification.

Doctorow: Yeah. Enshittification — it’s really a label I hung on an observation about a characteristic pattern of how platforms go bad, and, I think much more important, on why they’re going bad now. Because we didn’t invent greed in the middle of the last decade, so something has changed. My thesis is that some exogenous factors have changed.

So the pattern of platform decay is that platforms are first good to their end users while locking them in — that’s Stage 1. And once they know that the users have a hard time departing — when they face a collective action problem or have high switching costs — you can make things worse for the end users in order to lure in business customers with a good deal, safe in the knowledge that the users are unlikely to leave.

And so far, so good. I think a lot of people would echo that, but they would stop there. They would say: Oh, you’re not paying for the product, so you are the product — this is about luring in users and then bringing in the business customers who will pay for access to them.

But that’s not where it stops, because the business customers are also getting screwed, because the business customers get locked in, and this power that the platforms end up with over their business customers is then expressed in Stage 3, where they extract from those business customers, as well.

They dial down the value left behind in the platform to the kind of minimum, homeopathic residue needed to keep the users locked to the platform and the businesses locked to the users. Everything else is split up among the executives and the shareholders. And that’s when the platform is a pile of [expletive].

But the more important part is why this is happening now. Broadly, my thesis is that platforms used to face consequences when they did things that were bad for their stakeholders.

And those consequences came in four forms. They had to worry about competitors, but we let them buy those. They had to worry about regulators — but when a sector is boiled down to a cartel, the firms find it very easy to agree on what they’re going to do and to make their preferences felt, because they have a lot of money from not competing with one another, and they capture their regulators. They had to worry about their workers, because tech workers were in very scarce supply, they were very valuable, and they often really cared about their users. A worker could say: No, I’m not going to enshittify that thing. I missed my mother’s funeral to ship on time — and make it stick, because there was no one else to hire if they quit.

They were bringing a lot of value to the firm. But of course, tech workers famously thought that they were temporarily embarrassed founders, and they didn’t unionize, they didn’t think they were workers. So when the power of scarcity evaporated, they had not replaced it with the power of solidarity. So now you have 500,000 tech layoffs in three years, and tech workers can’t hold the line.

Then, finally, there was new market entry. There were new companies that could exploit something that I think is exceptional about tech. I’m not a tech exceptionalist broadly, but I’m an exceptionalist about this: Every program in your computer that is adverse to your interests can be neutralized with a program that is beneficial to your interests. That means that when you create a program that is deliberately bad, you invite new market entrants to make one that’s good.

If you lock up the printer so it won’t take generic ink, you invite someone not only to get into the generic-ink business but to get into the alternative printer-firmware business — which eventually could just become: I’m going to sell you your next printer.

What we’ve done over 20-plus years is monotonically expand I.P. law until we’ve made most forms of reverse engineering and modification without manufacturer permission illegal — a felony. My friend Jay Freeman calls it “felony contempt of business model.”

As a result, incumbents don’t have to worry about market entry via this incredibly slippery, dynamic character of technology.

And when you unshackle firms from these four forces of discipline — when they don’t have to worry about competitors or regulators or their work force or new market entry through interoperability — the same C.E.O.s go to the same giant switch on the wall of the C-suite, marked Enshittification, and they yank it as hard as they can, as they’ve done every day they’ve shown up for work.

And instead of being gummed up, it has been lubricated by an enshittogenic policy environment that allows it to go from zero to 100 with one pull. And that’s how we end up where we are today.

Klein: All right, I want to bring these out of theory — though, Cory, I applaud how well structured that was on the fly — and have you both walk through this with an example that you use in your books.

Cory, I want to start with you. Walk me through how you see enshittification as having played out on Facebook itself — not all of Meta, but Facebook. Where it started, when it was adding value to users in the early days to where you feel it has gone now. Tell me your Facebook story.

Doctorow: So Facebook — really, its Big Bang is 2006. That’s when they opened the platform to anyone, not just people with an .edu address from an American college. And Mark Zuckerberg needs to attract users. His problem is that they’re all using a platform called MySpace.

So he pitches those users, and he says: Look, I know you enjoy hanging out with your friends on MySpace, but nobody should want to use a surveillance-driven social media platform. Come to Facebook, and we’ll never spy on you. We’ll just show you the things that you asked to see.

So that’s Stage 1. But part of Stage 1, remember, is that there’s a lock-in. It’s just the collective-action problem. You love your friends, but they’re a pain in the [expletive], and if the six people in your group chat can’t agree on what bar to go to this Friday, you’re never going to agree on when it’s time to leave Facebook or where to go next — especially if some of you are there because that’s where the people with the same rare disease as you are hanging out. And some of you are there because that’s where the people in the country you immigrated from are hanging out. And some of you are there because that’s where your customers or your audience is, or that’s just how you organize the car pool for the kids’ Little League.

So we are locked in. And that ushers in Stage 2, making things worse for end users to make things better for business customers. So think about advertisers. Advertisers are told: Do you remember we told these rubes that we weren’t going to spy on them? Obviously, that was a lie. We spy on them from [expletive] to appetite. Give us pennies, and we will target ads to them with exquisite fidelity.

And so the advertisers pile in. Publishers pile in, too. They become locked to the platform. They become very dependent on it. And in Stage 3, advertisers find that ad prices have gone way up. Ad targeting fidelity has fallen through the floor. Ad fraud has exploded to levels that are almost incomprehensible.

Publishers famously now have to put their whole article there, not just an excerpt, and woe betide the publisher who has a link back to their website, because Facebook is downranking off-platform links as potentially malicious. So they don’t have any way to monetize that except through Facebook’s own system.

And we’ve got a feed that has basically been denuded of the things we’ve asked to see. It has the minimum calculated to keep us there. And this equilibrium is what Facebook wants, but it’s very brittle because the difference between: I hate Facebook — and: I can’t seem to stop coming here — and: I hate Facebook, and I’m never coming back — can be disrupted by something as simple as a livestream mass shooting. And then users bolt for the exits, the street gets nervous, the stock price starts to wobble, the founders panic. Although being technical people, they call it pivoting.

And one day, Mark Zuckerberg arises from his sarcophagus and says: Hearken unto me, brothers and sisters, for I’ve had a vision. I know I told you that the future would consist of arguing with your most racist uncle using this primitive text interface that I invented so I could nonconsensually rate the [expletive] of Harvard undergraduates, but actually, I’m going to transform you and everyone you love into a legless, sexless, low-polygon, heavily surveilled cartoon character so that I can imprison you in a virtual world I stole from a 25-year-old comedic dystopian cyberpunk novel that I call the Metaverse.

And that’s the final stage. That’s the giant pile of [expletive].

Klein: All right. Cory, you got a good rant there, my man.

[All chuckle.]

Wu: Cory could be a rapper if he decided to get into that line of business.

Doctorow: The world is crying out for a middle-aged technology critic rapper.

Klein: I think somebody could offer two counterarguments. First, for all the pivots, all the scams — by the way, I was a publisher during the era of the Facebook fire hose to publishers and the era of pivot-to-video when Facebook videos were getting these astonishing view counts — they kept all the money.

They promised everybody: Come get this huge scale, we’re giving you all this traffic, you can build a business here. There was no business to build there at any significant scale.

Second, it turned out that the video view counts were fraudulent. So a huge amount of the news industry, among other things, pivoted to video, and it was based on lies.

There was a recent Reuters report that Facebook was actually charging advertisers more for these things that they knew were scams.

Doctorow: Ten percent of their ad revenue is ads for scams by their own internal accounting.

Klein: I am really not here to defend Facebook as an actor. One of the crazy things amid all of this, that I think you really focused on there, was moving from showing us what we had asked to see to showing us what I would say Facebook wants us to see.

There was just the F.T.C. v. Meta case. Tim was, of course, involved in that. And one of the statistics that came out during it is that only 7 percent of time spent on Instagram is spent on content from people you follow. Similarly, on Facebook itself, it’s under 20 percent. I forget the exact number, but it’s very low.

They have moved — under competition from TikTok specifically, although not only — to these A.I.-driven algorithmic feeds, showing you not what you have asked to see but what they find will keep you there. And it does keep people there: Users come back, and they spend more time on Instagram when the feed becomes algorithmic.

This is the whole revealed-preference thing that you were talking about earlier. My personal experience of Instagram when I go on it now — and it’s one reason I try to go on it less — is that I can actually feel how much more compelling it is. I like it less, but the feeling of getting pulled into something is much stronger.

So I think if you had Mark Zuckerberg arising from his, uh ——

Doctorow: Sarcophagus.

Klein: I was going to say “office” because I’m a more polite person.

[Doctorow laughs.]

Klein: He would say: We did this under competitive pressure. TikTok was eating our lunch. We stole a bunch of things from TikTok, and now we’re doing better. We also stole a bunch of things from Snapchat, and now we’re doing better. Because in fact, we are under a lot of competition, and we are incredibly good at responding to that competition in ways that our user base responds to. This is not enshittification. This is the magic of competition itself. And you know that because, actually, look at our profit margin, and look at how much we’ve changed.

Doctorow: So let me say that I don’t think competition is a good unto itself. And I think it is absolutely possible to compete to become the world’s most efficient human-rights violator.

The reason I like competition is because it makes firms into a rabble instead of a cartel. So in 2022, two teenagers reverse engineered Instagram, and they made an app called the O.G. App. The way the app worked is: You gave it your login and password; it pretended to be you and logged into Instagram; it grabbed the session key; it grabbed everything in your Instagram feed; it discarded the ads; it discarded the suggestions; it discarded everything that wasn’t a chronological feed of recent posts from the people you followed. Facebook — or Meta — sent a letter to Apple and Google, who obliged them by removing the app, because there’s honor among thieves.

So if you want to find out what people actually prefer, you have to have a market in which people who disagree with the consensus — who think people aren’t just gut flora for the immortal colony organisms we call limited liability corporations, but are entitled to dignity and moral consideration as beings unto themselves — are able to offer some alternatives. That’s how you find out what people want.

But under modern I.P. law — something called the Digital Millennium Copyright Act — it is a felony to modify the app without permission. So when Meta sent the letter to Apple and Google, they agreed to side with Meta.

And because you can’t modify those platforms to accept apps that haven’t run through the store, that was the end of the road for the O.G. app.

Klein: But I think this is a little bit of a narrow example. As somebody who gets a huge number of press releases for all these prosocial apps built to compete with Instagram and TikTok — apps that are meant to respect your attention, apps that are meant to be virtuous in a way these apps are not — I watch as one after another, after another, basically goes nowhere and gets outcompeted. The point I’m making is that in the example you’re giving, they were able to say there was a terms-of-service violation. Maybe they should not be allowed to do that.

But there are lots of things that emerge and are meant to be better or different or something. And this is where I want to make sure my producer has a voice. There are people who just absolutely like TikTok. There are people who like Instagram. They know there are other things out there, and they’re not clamoring for a competitor or an alternative. I think suggesting that there is no capacity to switch is going a little far.

Doctorow: No, I’m not saying there’s no capacity to switch. I’m saying the higher the switching costs are, the lower the likelihood that people will leave.

When we had pop-up ads in our browsers — real pop-up ads, the paleolithic pop-up ad that was a whole new browser window, spawned at 1×1 pixels, autoplaying audio and running away from your cursor — the way we got rid of that was that it was legal to modify browsers to add pop-up blockers. More than 50 percent of us have since installed an ad blocker in our browsers.

Doc Searls calls it the largest consumer boycott in human history. And as a result, there is some moderation upon the invasiveness of what a browser does to you that is in marked contrast with apps, because reverse engineering an app — because it’s not an open platform — is illegal under American copyright law. It violates Section 1201 of the Digital Millennium Copyright Act.

And so when we talk about how these platforms have competed their way into toxicity, we are excluding a form of competition that we have made illegal — for example: ad blockers, privacy blockers, things that discard algorithmic suggestions and so on. Taking those off the table means that the only competitors you get are firms that are capable of doing a sort of holus-bolus replacement to convince you: No, you don’t want to use Instagram anymore — you want to use TikTok instead. As opposed to: You’d like to use Instagram but in a slightly different way that defends your interests against the firm’s interests.

But I think that we mustn’t ever forget that within digital technology and living memory, we had a mode of competition that we prohibited that often served as a very rapid response to specifically the thing you’re worried about here.

I have a friend, Andrea Downing, who has the gene for breast cancer, and she’s part of a breast cancer previvor group that was courted by Facebook in the early 2010s. And they moved there, and this group is hugely consequential to them, because if you have the breast cancer gene, you are deciding whether to have your breasts removed, your ovaries removed. The women in your life — your daughters, your sisters, your mothers — they’re dying or sick, and you’re making care decisions. This group is hugely important.

Andrea discovered that you could enumerate the full membership of any Facebook group, whether or not you were a member of it. This is hugely important to her friends there. She reported it to Facebook. Facebook said: That’s a feature, not a bug. We won’t fix it. We’re going to keep it.

They sued. It was nonconsensually settled when the F.T.C. settled all the privacy claims, and they are still there because they cannot overcome the collective action problem that it takes to leave. Now they will eventually. When Facebook is terrible enough, that community will shatter, and maybe it will never reform. That is not a good outcome.

Klein: Tim, I want to go to your story here. One of the core tales you tell in your book is about Amazon. Walk me through the process of moving a platform from a kind of healthy, constructive platform to becoming an extractive platform through your story of what happened with Amazon.

Wu: So back in the 1990s, when people were thinking about what is going to be great about the internet, there was this serious promise that it was going to make a lot of people rich.

It was going to distribute wealth in new ways. And when people talked about how that was going to happen, a lot of it was like: Well, you know, everyone is going to be able to have their own store, sell stuff, and people are going make a lot of money that way. So that was a big promise, and I think it was an important promise.

Back in the day, it was mostly eBay that people were talking about. So then you have Amazon. Amazon was, you may remember, once upon a time a bookstore. [Chuckles.]

Klein: I do remember that, actually. That’s how old I am.

Wu: And their basic idea was to be bigger and to sell more stuff. At some point they opened the Amazon Marketplace, which was different because it was a platform. In other words, it was a place where people could come and sell their stuff. At first it was used books, and it spread into other markets.

They realized a few things. One is that fulfillment would be very important. In the old days, eBay sellers had to wrap it themselves and send it off, so that wasn’t a very scalable model. They had a good search engine — Amazon invested hard in search. And it worked. And more and more sellers came, more and more buyers came.

The Amazon Marketplace overtook eBay and became very successful. And at that point, maybe around 2010 or something like that, it was fulfilling what I would call the dream of the internet age, which is that a lot of people will be able to go into this place, start their thing, make a lot of money. It coincides with the rise of the blog and small online magazines — that whole era that we are talking about.

During that period, Amazon’s take was below 20 percent — it depends how you count, but it was somewhere between 15 and 20 percent.

Klein: Their take of what a small business is selling?

Wu: Of the sales, yes. So if you sold $100, they’d take $20. I mean, it would depend a little bit — there were some storage fees, and so on. So it was a good place to make money.

What changed, I think, was once Amazon had confidence that it had its sellers and it had its buyers more or less locked up — and this was basically over the 2010s — they bought a couple of companies that were potential threats to them. Diapers.com, for example — it might seem ridiculous, but diapers could have been a way in to threaten them.

Klein: Why don’t you tell the Diapers.com story for a minute? It’s a kind of famous story on Amazon, but I think it’s worth telling.

Wu: So there was a platform launched to be an alternative to Amazon, and their thought was: New parents, diapers. Every parent needs diapers delivered quickly. So why don’t we make that the beginning, in the same way Amazon started with books? Then Amazon saw this, thought it was kind of threatening and, in the strategy of the day, just bought them. Of course the founders were pretty happy.

And Amazon managed to basically capture this market. That’s when I think it turned to the extraction phase.

Over the last 10 years, Amazon’s strategy for its marketplace has just basically been to turn the screws and increase the fees, change the margins, so that many sellers are paying 50 percent or more — basically the same as brick-and-mortar businesses. And Amazon prices are rarely any lower. They have done a lot to try to prevent anyone from pricing lower.

And I think the one thing I would focus on is what they call advertising, which may be familiar to you as sort of the sponsored results that you get when you’re searching. What’s going on there is that sellers are bidding against each other, bidding down their own margins, to get higher up in the search results. And that little trick — that sort of one weird trick — has become this extraordinary cash cow. It’s more profitable than Amazon Web Services, which is sort of surprising. Last year, it was $56 billion they made ——

Klein: Just paying Amazon for higher rankings in those search results was $56 billion?

Wu: Yes, $56 billion. It’s looking like it’s going to be over $70 billion.

Klein: Cory, when I’m searching on Amazon, and I see that “Amazon’s Choice” looks like a little prize, like that product won a competition where a bunch of editors chose it as the best one, what am I looking at there?

Doctorow: So that is broadly part of this thing Tim is discussing, where they’re piling on junk fees for the right to be at the top of the results. It’s more subtle here: If you’re not paying for Prime and paying for Fulfillment by Amazon and paying for all these other things, you aren’t eligible. And the more of these you buy, the greater chance you have of being chosen.

Klein: But are they literally paying to be Amazon’s top choice? I mean, as a dumb consumer, maybe I look at that and I think: Oh, this is some algorithmic combination of: Is it the best seller? What are its reviews?

Doctorow: So you’re right that it is algorithmic, but the algorithmic inputs are not grounded primarily in things like quality or customer satisfaction. They’re grounded in how many different ways you’ve made your business dependent on Amazon, in such a way that every dollar you make is having more and more of that dollar extracted by Amazon.

There’s some good empirical work on this from Mariana Mazzucato and Tim O’Reilly. They calculate that the first result on an Amazon search engine results page on average is 17 percent more expensive than the best match for your search.

So that’s what you’re seeing, basically: The Amazon top choice is the worst choice.

Klein: This really feels to me like a place where, to use Cory’s word, things enshittified. When I go around the internet now, when I play something on a Spotify playlist or click on a song I like and move to the radio version of Spotify or when I search something on Google or when I search something on Amazon, these used to be very valuable services to me.

To search for something on Amazon and see rankings weighted by how popular the product is, how high the reviews are. I took the weighting of the search as, to some degree, a signal of quality. Certainly Google, the whole idea was that what comes first in search was built on page rank, and it was going to be quality. Spotify had an algorithmic dimension to it, but it was supposed to be showing me things that people like me would like to listen to.

And now, there is so much sponsored content in every one of these results, and it is so unclear what is what and who is paying for what and why I am getting this song or that result.

One reason I ended up on these platforms is because I trusted these results, and now I trust nothing.

Wu: I mean, going back to the definition of extraction, we are kind of paying $70 billion collectively to make search worse.

Klein: So when does this move from: This is just their business model, and if you want to find something else, go buy it at Walmart, at Target, at Best Buy. You can do all those. I’ve done all those. I just ordered a blender from Kohl’s. To: We’ve moved to extraction, and we should see it as a public policy problem?

Wu: I think that’s a really great question. It’s the kind of question we’ve faced repeatedly in history. When a business model starts to settle down, you see less real disruptive competition become possible. I think once a market has settled, at some point you’ve got to call a limit. We do that in many other markets.

And Amazon is still a great way to find a lot of products. It’s the world’s largest marketplace. But I would say they’re running themselves like an unregulated monopoly.

Klein: Both of you spend a lot of time on the number of small acquisitions that these companies make. And maybe many of them get shut down or they acqui-hire the top people, but there are also things that might have grown into something bigger.

On the other side, sometimes it really is the case that a big player can scale something small up into something new. Waymo grew out of Google’s own self-driving car project, seeded by some early acquisitions, and kind of amazingly, they seem to have made driverless cars work. And I think access to Google’s compute and other things was not insignificant in that.

You can look at other cases where these companies buying something small, they’re able to build it into something that ends up being a great option in Microsoft Office or in Google Docs or whatever it might be. So how do you think about the ways in which that harms competition?

I’ve known founders who get acquired and are excited about it because they think it will give them scale and the capacity to compete in a way they couldn’t on their own, versus Google just trying to do it itself. The antitrust question is one thing, but the anticompetitive-versus-pro-scale question is a much bigger challenge, given the way Silicon Valley now works.

I’m curious to hear you talk through the pros and the cons of that.

Wu: I think that’s a really great question. Joseph Schumpeter, back in 1911, wrote a book about entrepreneurs, basically. And he said these very unusual people, almost superheroes, were willing to go out and take these kinds of risks, that they have some vision. He thought they were essential to economic growth.

The United States economy, in general, has thrived because it has a lot of those kinds of individuals, and they can start things. But I think we’ve erred too far in having all the brains under one roof.

It’s starting to remind me of AT&T in the 1960s or IBM, where they sort of became much more centralized about innovation, and big ideas would never be developed. It became kind of groupthinky.

When the Justice Department sued AT&T and tried to break them up, the settlement forced AT&T to stay out of computing and to license all of their patents, including the transistor patent. And all kinds of people started quitting their jobs and saying: I’m going to start a semiconductor firm.

Therein lie the origins of U.S. semiconductors. And also, frankly, of U.S. computing outside AT&T.

So I think we have done much better with divided technological leadership. I frankly think that large language models might never have gotten started without OpenAI being an alternative force because they’re obviously threatening to Google’s business model.

Klein: Although, don’t you have to give Google some credit on L.L.M.s, specifically? You were talking about transistors a minute ago. Google does the fundamental research in transformers and releases it publicly.

Doctorow: But doesn’t do anything with it internally until there’s a competitor.

Klein: That’s right. It’s just striking how good of an actor they were for a period on A.I., specifically, treating it like they had a Bell Labs.

Wu: I agree with that. It actually is a lot like Bell Labs in the sense that Bell Labs kept inventing stuff. I mean, Bell Labs collected a lot of amazing people and then never let things come to market, the internet being probably the best example of it.

Doctorow: Yes, I think when you look at these companies and their acquisitions, what you see is that these companies very quickly suffer from what both Brandeis and Tim called “the curse of bigness.” They find it very hard to bring an actual product to market that they invent in-house.

When you look at Google, they’ve had one really successful consumer-facing product launch, and that was in the previous millennium. Almost everything they made in this millennium failed: It either didn’t launch, or after it launched, they shut it down. Whereas their giant successes — their video stack, their ad-tech stack, documents, collaboration, maps, navigation, server management, mobile, all of this stuff — these all began as companies they acquired from someone else and operationalized.

And I’m an ex ops guy. I’m a recovering sysadmin. So I’m not going to say that’s nothing. It is a skill unto itself, the careful work to make things work and make them resilient ——

And scale them. But the idea that has to happen under one roof, I think, is a false binary.

One of the things Google arguably did far more effectively than hiring innovators is hiring operations people. Those are the people who really do the yeoman’s service at Google, because the innovators, the product managers, never get to launch. They only get to buy other people’s products and refine them.

Wu: It comes down to what you think of as the track record, I guess, of monopolized innovation. It has some hits, but I’m saying a much more mixed model, I think, historically, is a lot stronger.

If you look at the entire track record of U.S. innovation — I think monopoly innovation leads you toward an AT&T, Boeing, General Motors kind of model as opposed to what the best of Silicon Valley has been.

Doctorow: Meanwhile, you mentioned acqui-hires. For people who aren’t unfortunate enough to be steeped in the business of Silicon Valley, an acqui-hire is when a company is purchased, not for the product it makes but because the team that made it have proved they can make a product. And then they shut down the product, and they hire the team.

And acqui-hires are, I think, a leading indicator of pathology in tech and investment. An acqui-hire is basically a postgrad project where venture capitalists sink some money into you, pretending that you’re going to make a product — really a science-fair demo — in the hopes that a big company will buy you. In lieu of a hiring bonus, the founders get stock; in lieu of a finder’s fee, the investors get stock. But no one is trying to actually capitalize a product or a business.

I think anytime you see a preponderance of acqui-hires in your economy, that should tell you that you need to sit down and figure out how to rejigger the incentives, because your economy is sick.

Klein: Cory, we’ve been talking here about these markets as really having two players in them, maybe three — we’ve been talking about users, sellers and platforms. But something that your book focuses quite a bit on is a fourth that we need to talk about, too, which is labor.

There are huge numbers of people working for these companies, huge numbers of people delivering Amazon packages and Walmart packages. And one thing that both of you focus on is the way in which, as these companies become bigger and more dominant, their labor practices can become — I don’t know if “enshittification” is the term you would use there, but [expletive] or more extractive.

Can you talk a bit about that side of it? What has happened to the labor practices?

Doctorow: We could talk about the other tech workers, right? The majority of tech workers drive for Uber or for Amazon or work at a warehouse. And they certainly don’t get free kombucha and massages and a surgeon who will freeze their eggs so they can work through their fertile years. They’re in a factory in China with suicide nets around it.

An example that kind of pulls this all together — how you get monopoly, regulatory capture, the degradation of labor, with technology that relies on blocks on interoperability — I think we could do no better than to talk about nurses. And I’m going to be making a reference here to the work of Veena Dubal, a legal scholar who coined a very important term: “algorithmic wage discrimination.”

In America, hospitals preferentially hire nurses through apps. And they do so as contractors. Hiring contractors means that you can avoid the unionization of nurses. And when a nurse signs on to get a shift through one of these apps, the app is able to buy the nurse’s credit history.

The reason for that is that the U.S. government has not passed a new federal consumer privacy law since 1988, when Ronald Reagan signed a law that made it illegal for video store clerks to disclose your VHS rental habits.

Every other form of privacy invasion of your consumer rights is lawful under federal law. So among the things that data brokers will sell to anyone who shows up with a credit card is how much credit card debt any other person is carrying, and how delinquent it is.

Based on that, the nurses are charged a kind of desperation premium. The more debt they’re carrying, the more overdue that debt is, the lower the wage that they’re offered, on the grounds that nurses who are facing economic privation and desperation will accept a lower wage to do the same job.

Now this is not a novel insight. Paying more desperate workers less money is a thing you can find in, like, Tennessee Ernie Ford songs about 19th-century coal bosses. The difference is that if you’re a 19th-century coal boss who wants to figure out the lowest wage each coal miner you’re hiring is willing to take, you have to have an army of Pinkertons figuring out the economic situation of every coal miner, and you have to have another army of guys in green eyeshades making annotations in the ledger where you’re calculating their pay packet. It’s just not practical. So automation makes this possible.

And you have this vicious cycle where the poorer a nurse is, the poorer they become, the lower the wage they’re offered, and as they accumulate more consumer debt, their wage is continuously eroded. And I think we can all understand intuitively why this is unfair and why as a nurse you might not want it. But also, do you really want your catheter inserted by someone who drove an Uber until midnight the night before and skipped breakfast this morning so they could make rent?

This is the thing that makes everyone except one parochial interest worse off. And this is not a free-floating economic proposition. This is the result of specific policy choices taken in living memory by named individuals who were warned at the time that this would be the likely outcome, and who did it anyway.

Klein: I think this is getting at something we’re starting to hear a lot about, which is anger over algorithmic pricing of various kinds. When I was walking up to do the podcast today, the chyron on CNN was about an investigation finding that Instacart was charging many different people many different prices. So the price you were seeing on Instacart wasn’t the price; it was your price.

And I could imagine a neoclassical economist sitting in my seat right now and saying that pricing becomes more efficient when it discriminates. That the market will be more efficient if it can charge Ezra a higher price for kombucha, if I’m getting that delivered, because of things it knows about me and my kombucha habits, and it charges somebody else a lower price because it knows they value the kombucha less. Or a nurse a higher wage and a lower wage depending on their situation. That, in fact, we’re just getting better and better and better at finding the market-clearing price.

And this is what economics always wanted. We’re finally hitting the utopia of every person having the market-clearing wage and the market-clearing price.

Why don’t you agree with that?

Wu: The fundamental question is: Is that really the kind of world you want to live in? In other words, do you constantly want to live in a place where you are being charged the maximum you would pay for something?

Now that could redound to the benefit of people who are very poor. But in economic terms, perfect price discrimination means the producers capture all of the surplus in the market. And, moving away from the efficiency of it, I think it makes for a very unpleasant way to live, to be constantly feeling you’re being exploited.

The other thing I’ll say is there’s also a huge amount of effort people make trying to move the category they’re in and pretend to be poor. So I think it is overrated and relies on overly simplistic models of what makes people happy.

Klein: There’s a way in which efficiency, I think, is an interesting term in economics, because in economics, as in life, you want things to be somewhat efficient, but too much efficiency becomes truly inhuman.

I find this even in the very modest example of personal productivity efforts. It’s great to have a to-do list. If I really force myself onto the scaffolding of a to-do list at all times, I feel like I cease to be a human being and become a kind of machine, always just getting things done and responding to the emails.

And this is a place, I think it was important, Tim, when you said it raises the question of what kind of world you want to live in. Because the truth is that I don’t want to live in a maximally efficient world. I have other competing values. The competitive efficient market is good up to a point, and after a point, it becomes something corrosive to human bonds and human solidarity. Just-in-time scheduling makes sense from the perspective of economic efficiency, but not if you want healthy families in your society.

And I think being able to articulate that question of what kind of world you want to live in, not just what kind of economy works on models, is important and often a lost political art, in my view.

Wu: Yes, I agree. And I feel there are some intuitive feelings, like people feel it’s unfair, people don’t like being ripped off, people hate paying junk fees. The original word for that, by the way, was “[expletive] fees,” but inside government, we felt we couldn’t have the president say that. [Chuckles.]

So yes, I think that gets to the heart of the matter. You had also talked about human attention, and human attention turns out to be quite commercially valuable. But do you really want every second of your time and every space you inhabit to be mined for your attention and its maximum value, even if that contributes to the, I guess, overall gross domestic product of the economy? I mean, I’d like to have some time for my kids and friends in which no one is making any money.

And it’s an example of a commodity that is very close to who we are. At the end of your days, what your life was is what you paid attention to. The idea that you can, with maximum efficiency, mine that at every possible moment seems to me a recipe for a very bad life.

Doctorow: I think one way to frame this, rather than around efficiency, is around optimization. And I think that we can understand that for a firm, the optimal arrangement is one in which they pay nothing for their inputs and charge everything for their outputs.

So optimization: Things are optimal from the perspective of the firm when they can discover who is most desperate and pay them as little as possible — or who is most desperate and charge them as much as possible. But from the perspective of the users and the suppliers, things are optimal when you get paid as much as possible and are charged as little as possible. And so much of the specific neurological injury that arises from getting an economics degree is organized around never asking the question: Optimal for whom?

I mentioned before that we don’t have any privacy law in this country. One of the things that a privacy law would let us do is to become unoptimizable.

All optimization starts with surveillance. Whether it’s things like TikTok trying to entice your kids into spending more time than they want to spend there or whether that’s advertisers finding ways to follow you around and hit you up with things that you’re desperate for or whether it’s discrimination in hiring or in lending, all of this stuff starts with an unregulated surveillance sector.

We have platforms that take our data and then sell it and use it and recycle it, becoming sort of the Lakota of information, using every part of the surveillance buffalo, and we do nothing to curb that behavior. It is not an incredible imaginative lift to say that we might tell them to stop.

Klein: I want to pick up on surveillance because when you talk about the harms to an economy working in a human way, I think that the new frontiers in how you can surveil workers is going to become a very big political issue, and probably should be already.

Doctorow: I agree. The category that this falls into, it’s broadly called bossware. And there’s a whole lot of different versions of it. If your firm buys Microsoft 365, Microsoft will offer your boss the ability to stack rank divisions within your firm by how often they move the mouse and how many typos they make and how many words they type.

And then — this is amazing — they will tell you how you perform against similar firms in your sector, which is the most amazing thing I can imagine, that Microsoft is finding customers for a sales pitch that says: We will show you sensitive internal information about your competitors — and apparently none of those people are like: Wait, doesn’t that mean you’re going to show my competitors sensitive commercial information about me? So you have this on the broad strokes level.

But I have this notion I call the [expletive] technology adoption curve. If you’ve got a really terrible idea that involves technology that’s incredibly harmful to the people it’s imposed on, you can’t start with me. I’m a mouthy, white, middle-class guy with a megaphone. And when I get angry, other people find out about it. You have to find people without social power, and you grind down the rough edges on their bodies.

You start with prisoners. You start with people in mental asylums. You start with refugees. And then you work your way up to kids, and then high school kids, blue-collar workers and pink-collar workers and then white-collar workers. And it starts with: The only people who eat dinner under a CCTV camera are in supermax. And 20 years later it’s like: No, you were just dumb enough to buy a home camera from Apple or Google or — God help us all — Facebook. That is the [expletive] technology adoption curve.

And if you want to know what the future of workers is, you look at the least privileged workers at the bottom. Then you see that technology working its way up.

If you look at drivers for Amazon, they have all these sensors pointed at their faces, sensors studded around the van. They’re not given a long enough break even to deal with things like period hygiene, so women who drive for Amazon, who go into the back of the van to deal with their periods, discover that’s all on camera because that’s all being recorded. All of this stuff is subject to both manual and automated analytics.

At one point, Amazon was docking drivers for driving with their mouth open because that might lead to distraction while driving. And so as you say, it kind of denudes you of all dignity. It really is very grim.

You know, Tim and I used to ride the Toronto Transit Commission buses to school in the morning when we were in elementary school. And we loved the drivers, who would sing and tell jokes and remember you. This is the thing that makes working in the world, being in the world, great. It’s having a human relationship with other humans, not having standardized labor units that have been automated and standardized to the point where they can be swapped out.

If you give a cashier a cash register instead of making them add things up on paper, you could use the surplus to let them talk with the customers and have a human relationship with them. Or you could speed them up so that you fire nine-tenths of the cashiers, and you make the remainder work at such an accelerated pace that they can’t even make eye contact.

Klein: Tim, there were things in Cory’s answer there that, in my view, we should just make a social decision to outlaw. Like, I am willing to say, politically, I want to vote for the people who think you can’t eyeball-surveil workers. And if other people want to stand up and say the surveillance of workers’ eyeballs is great, that’s a good values debate to have in a democracy, and I know where I fall on that.

Then there are other things. I’ll build on the cash register example to say that I really struggle with what, as a public policy measure, one should think about the rise of automated checkout and the way we’ve seen it. I’ve watched people turned into these managers of machines. They’ve gone from being somebody who did checkout with me and asked me how my day was and I asked them how their day was, and now they get called over because the three apples I put on the weighing machine didn’t weigh in correctly. And it seems dehumanizing to them, dehumanizing to me. I also get it.

How do you think about weighing that? There’s the stuff that is genuinely grim and dystopic and maybe we should just outlaw, and then there’s stuff like generalized automation where there genuinely can be a consumer surplus. Like, time is a surplus for me. Things moving faster is a surplus for me. More checkout stations is a surplus for me. But there’s a cost on the other side of it.

Wu: Well, the first thing I’d say is we should be making more of these kinds of decisions about what we really care about and what kind of world we want to inhabit.

One of the things I think happens is that these decisions get made by default. We don’t pass any laws or have new ethical codes. I mean, ethics does a lot of work. And we just sort of give new stuff a trump card because it’s new. And I get that you don’t want to ban everything new that shows up, but I feel that over the last 15 years or so, we have sometimes just taken the position that the people don’t get to vote on this.

A good example is everything to do with children. I don’t think there are a lot of people who think it’s a great thing to surveil children and have targeted ads for children and try to create addictive technologies for children.

When I worked in government, we tried to pass just basic child privacy laws. We couldn’t get a vote — ever. And so one of the things that’s going on is we’re not even deciding these things as society, and that gets to the problem of Congress not taking votes on popular issues.

I also think this relates to our conversation earlier about competition and when it’s good and when it’s bad. Because I think for almost any endeavor, there’s such a thing as healthy competition and such a thing as toxic competition.

We were talking about attention markets earlier. What is good, healthy competition in the attention markets? It’s making really great movies, new TV shows that people love, podcasts that people want to listen to. Toxic competition is the stuff you’re talking about, essentially different forms of manipulation and addiction.

We’ve had this hands-off, we cannot try to direct things in a positive direction approach. I think that has been a giant mistake. So first, I would say we have to try to make the decisions.

How would I do the trade-off? I guess I would start with the most irredeemably toxic stuff and ban that first. I mean, that's maybe easy, but we haven't been able to do even that.

And I was sort of shocked when I worked in government that we just could not get a vote on what seemed like basic stuff: privacy laws, the fundamentals of anti-surveillance law. Even the national security people were really into this stuff.

They're like: It's too easy to spy on everybody, and that's a problem for us as a national security issue. And we just could not get a vote on even the most basic anti-surveillance rule, something that would say: If you download a dog-walking app, it shouldn't be tracking you and uploading every kind of information about you. That should be illegal.

Klein: I have been very disturbed we’ve not been able to do more on surveillance and privacy, and I’ve also been struck by how badly what has been done elsewhere seems to have worked out. I call this terms and conditions capitalism, where you just move the burden onto the consumer.

So Europe has put out some very sweeping rules that have given me the opportunity to individually decide which of the 303 cookies on every website I visit might be good or might be bad. Similarly, nobody has ever, in my view, to a first approximation, read an iOS terms and conditions update.

And I have found that, very often, where policymakers end up after the debate is saying: Well, as long as there is disclosure, then the consumer can decide.

But the consumer, in a very rational way, does not want to decide. So it has ended up in a very dispiriting place. Instead of creating a structure in which I'm confident that what companies are doing is well bounded, it has demanded of me a level of cognitive work that I'm not willing to do, and that I think nobody else is willing to do, to oversee those companies myself, with no really great options if I don't like what they're doing.

I’m curious how you think about that.

Wu: I couldn’t agree more. I feel like if the byproduct of government action is that you are clicking on more little windows, that is government failure. And I would trace it to, frankly, a lack of courage on the part of government and the regulators or the officials to make decisions that are really supposed to help people. It’s much easier to say: Well, I’m afraid to do something, so I’m going to help them decide.

So I agree. I think the G.D.P.R. has actually failed to prevent surveillance.

Klein: That being the European bill that created all those pop-ups.

Wu: Yeah. [Chuckles.] G.D.P.R., the European privacy laws succeeded in creating a lot of pop-ups and things to mess with. It succeeded in making it harder to challenge big tech companies in Europe, because they’re overregulated, and so the little guys have to also go through all this stuff.

So yes, I think this has been a failure. I think for people to start to believe in government again, it has to help us in situations where we are not strong enough to deal with something much more powerful or something that has a lot more time to think about it. It’s like we’re playing poker against experts.

At some point we need to get a backbone and have government on people's side. Now I'm starting to sound like a politician: helping people when they are powerless or distracted or don't have the energy to deal with things.

Klein: Cory?

Doctorow: So look, I love you both, but I think you’re dead wrong about the G.D.P.R., just as a factual matter, about where it comes from, what it permits, what it prohibits and why it failed. Because I agree — it failed.

So you may ask yourself: How is it that G.D.P.R. compliance consists of a bunch of cookie compliance dialogues? And the answer to that is that European federalism allows tax havens to function within the federation. One of the most notorious of those is Ireland. And almost every American tech company, except for Amazon, pretends that it’s Irish, so that its profits can float in a state of untaxable grace in the Irish Sea.

And because of the nature of the G.D.P.R., enforcement for these [expletive] cookie pop-ups, which are the progeny of the big American tech companies, starts in Dublin with the Irish data protection commissioners, who, to a first approximation, do nothing.

Klein: That sounds bad, but I want to get you to explain the core mechanism you’re describing here better, because I actually don’t know it. That bill did pass, and then all of a sudden the entire internet filled with these pop-ups.

Doctorow: That’s only because the companies went to Ireland, broke the law and said: We’re not breaking the law. And if you disagree, you have to ask the Irish data protection commissioners to enforce against us.

A few people — Johnny Ryan with the Irish Council for Civil Liberties, Max Schrems with N.O.Y.B., or None of Your Business, this European nonprofit — have dragged some of those cases to Germany. More important, they have gotten the European Commission to start modifying the way the law works, so that you can just tick a box in your browser preferences, one that can even come turned on by default, that says: I don't want to be spied on. And then they're not allowed to ask you. The answer is just going to be no.

And so I think that corporations want you to think that it is transcendentally hard to write a good law that bans companies from collecting data on you. And what they mean is: It’s transcendentally hard to police monopolies once they’ve attained monopoly status because they are more powerful than governments.

And if that’s their message, then a lot of us would be like: We need to do something. We need to turn the cartel into a rabble again. As opposed to: God, I guess governments just have no role in solving this problem.

Klein: Here's the one place where I do disagree with you, having covered a lot of different cartels and rabbles lobbying Congress: It's not easy to regulate the association of community banks, for instance. When there are, in every single district, individual leaders who will come and lobby their member of Congress, it's really hard.

I’m not saying that monopolies are good because they make it easier to regulate. I’m just saying that it doesn’t solve the problem. The government runs on money and influence.

Doctorow: Can we agree on necessary but insufficient?

Klein: Yes, we can do that.

I want to build on this and ask Tim a separate but related question. Tim, you mentioned the entertainment industry, and one of the questions that's about to come up is whether Netflix should be able to buy all of the assets — or all the entertainment assets, I should say — of Warner Bros. Discovery. And this is one where people who care about the quality of the media we consume seem — for reasons that seem compelling to me — very, very worried about having that happen.

How would you think about that? And is this a place where we need to be, say, making values judgments that are different than our antitrust judgments? Is this a place where the antitrust laws can suffice? Is everybody just worried about something they don’t need to be worried about? How do you see it?

Wu: I think this is a place where, if the antitrust laws are enforced correctly and fairly, the acquisition would be blocked. And I’d say that this is not a particularly exotic situation in the sense that you have the No. 1 premium streaming company wanting to buy the No. 3 or No. 4. And if you do the numbers under the guidelines, which the government issues and which tell people when their mergers are presumptively illegal, the result is that this is a presumptively illegal merger.

The reason I do think it’s bad is I think that Netflix and Warner Bros. Discovery have, frankly, over their history, been some of the most innovative, interesting outlets, and often in an oppositional role. This goes way back, but Warner Bros. Discovery took a chance on sound film back in the 1920s. In the ’50s, they took a chance on television, which people thought was useless. And then prestige television in the early 2000s with HBO and the golden age. So they’ve taken a lot of bets.

Netflix has done a lot of innovative stuff — really interesting, obviously. And frankly, you want to talk about good tech over the last 20 years: How about not having to wait until your show comes on? That’s a form of efficiency I can agree with.

I think it would be a tragedy to have these two companies, which are often so oppositional, combined into one. I think culturally it would be a great mushification. At the economic level, just to continue on this, I think it’s usually going to be those two companies that are bidding for the most interesting shows. So if you had a new version of “The White Lotus” or “The Wire,” who is going to be bidding for it? It’s going to be HBO and Netflix. So the elimination of one bidder is just the definition of a loss of useful competition.

So yes, I think it’s pretty straightforwardly illegal. I don’t think it’s that complicated.

Klein: Cory, you look like you wanted to jump in on that.

Doctorow: I think that one of the things we should probably anticipate Warner Bros. Discovery saying in defense of this merger is the same thing that Simon & Schuster and Penguin Random House said in defense of their proposed merger, which was blocked under the Biden administration. They said: Oh, well, we'll still internally bid against one another within our divisions for the most premium material. We'll be exposed to discipline that way.

And I love what Stephen King had to say about this when he testified. He said: That's like me and my wife promising to both bid against each other on the next house we move into.

Klein: Tim, one thing I was thinking about while I was reading your book was the metaphor you use of a gardener. The way to think about economic regulation and antitrust and a bunch of the different buckets of solutions we're talking about is: It is like a gardener who is trying to keep certain species and plants from taking over the garden, and the gardener has to make judgments. Some decisions you make as a gardener because you don't want blight getting into your garden and killing everything. Others are made for aesthetic reasons, and others because you want to have native species and not invasive ones. There are all these decisions being made.

Having been around conversations about economic regulation and tech regulation for a long time, I've come to the view that there is a fetish in them for truly neutral rules. What people always seem to be looking for is a rule that you don't have to apply any judgment to. You can just say: If you get over this line, everybody knows it's bad. As opposed to actually having to say: We have views about how the economy should work. We have views about how our society should work. We want small businesses to prosper, and they'll prosper more if they don't have to give 30 cents of every dollar to Apple or Google, or, if you're selling on Facebook Marketplace, to Facebook.

And yet, you’ve been a policymaker, Tim. I think that there has been a defensive crouch, particularly among Democrats — and Lina Khan and others were an exception to this — an effort to describe everything neutrally when sometimes you just don’t want to be neutral on how fundamental companies and markets in your economy are working. You want to be able to have values that those serve, as opposed to your values being subservient to your economy.

Wu: Yes, I agree with that. And I think it's an astute observation. I think it comes, as I said earlier, from a lack of courage or vision. It reminds me of when you were talking about: Well, OK, we'll just create a bunch of windows and let everybody decide what options they want for their privacy and hope that works. It comes from that same impulse: that we don't actually want to arrive at a vision of the good society. It's one of the flaws of classical liberalism, frankly, if you get into the political theory.

And frankly, the gardening in that metaphor is targeted. It's not just: Let it all run and see what happens. It is an approach where you have some idea of what kind of world we want to live in and what kind of society we think is good, and you have to make decisions based on that.

I think we need a vision of what we want and what a good country looks like and a good place to live.

Doctorow: The thing we really want to be asking before we ask any of these other questions is: How often are you going to have to answer this question?

So lots of people are like: Oh, we should just ban hate speech and harassment on platforms. Well, that’s hard. Not because we shouldn’t do it, but because agreeing on what hate speech is, agreeing on whether a given act is hate speech, agreeing on whether the platform took sufficient technical countermeasures to prevent it, is the kind of thing you might spend five years on — and hate speech happens a hundred times a minute on platforms.

Meanwhile, if we said we are going to have a bright-line rule that platforms must allow people to leave but continue to communicate with the people they want to hear from, then people who are subjected to hate speech could leave and go somewhere else. They are currently there because the only thing worse than being a member of a disfavored and abused minority is being a member of a disfavored and abused minority who is isolated from your community.

And it’s not that we shouldn’t continue to work on hate speech in parallel, but if you think that a rule that takes three years to answer a question is going to solve a problem that happens a hundred times a second, you’re implicitly committing to full employment for every lawyer in the world to just answer this question.

Klein: One thing I admire about both of your books is that you spend a lot of time on solutions. I don’t think we can go through every one, but let me do it this way: For each of you, and Cory, why don’t we start with you: If you were king for a day, what are the three areas or the three policies — you can define it the way you want — that you think would make the most difference?

Doctorow: Sure. One would be getting rid of this anti-circumvention law in America — it's Section 1201 of the Digital Millennium Copyright Act — and saying that it should be legal to modify things you own to do things that are legal, and that it shouldn't be the purview of the manufacturer to stop you from doing it.

Another would be to create a muscular federal privacy right with a private right of action, so that impact litigators like the Electronic Frontier Foundation, as well as aggrieved individuals, could bring cases when their privacy rights were violated.

And I guess the third would be an interoperability mandate specifically for social media. We’ve had versions of this — the ACCESS Act was introduced, I think, three times. Various versions, they’re all pretty good. Mark Warner, I think, was the main senator behind them. But a rule that just says that you should be able to leave a social media network and go to another one and continue to receive the messages people send to you and reply to them, the same way you can leave one phone carrier and go to the other. And there are a lot of technical details about what that standard looks like and how you avoid embedding parochial interests of incumbents and so on. I don’t think they’re insurmountable. And I think that the trade-offs are more than worth it.

Klein: Tim?

Wu: So I’ll say three things. First, I think we need the confidence to ban the worst and most toxic business models that are out there, whether it’s exploitation of children, whether, frankly, it’s some of this total, absolute price discrimination you’re talking about, which may technically already be illegal.

Second, I think that it’s unquestionable that the main tech platforms have become essential to commerce. I’m not in any way thinking you can do without them. So I think we need to understand which of them need to be treated more like utilities and which of them need to be not allowed to discriminate in favor of themselves or between customers, to try to maximize their extraction.

Klein: Can I hold you on that one for a minute?

Wu: Yes, sure.

Klein: Because when I hear this, it makes sense to me. And then I think to myself: Do the people I know who focus on how utilities act and are regulated seem happy with the situation? And the answer is no, they all think it’s a total disaster.

So when you say they should be treated as utilities, but you worked in the Biden administration, and you know that everybody who works on, say, green energy will tell you that the models and regulatory structures of the utilities are a huge, huge, huge problem, what specifically do you mean?

Wu: It’s a good question, and I’ve spent a lot of my life exposed to that. But I think what’s important about utility regulation is what it doesn’t allow to happen.

Like, the electric networks and the electric utility regulators are not perfect. On the other hand, if you think about the electric network, it has been an extraordinary foundation for people to build stuff on. And the reason they're able to build on it is that they don't think the electric network is going to take half their profits if they invent the computer on top of it. Or they don't think that, for example, the electric network is going to decide that it likes Samsung toasters instead of LG's or, I don't know, Zenith's toaster — something like that. So they don't discriminate between manufacturers on the electric network.

So I think we need to understand and look carefully at which parts of the platforms are most like the electric network or the broadband network, where they are essential to the rest of business and therefore need to play by different rules.

And some of those main rules — the most obvious are duties of treating everybody the same, so they don’t play favorites. And then if you’ve got it figured out, you get to the question of price regulation. Maybe Amazon’s margin would be capped at 30 percent or something like that.

Klein: And then No. 3 for you?

Wu: No. 3 — you know, I’m an antimonopoly kind of guy. I think we need constant pressure on the main tech platforms so that they stay, I guess, insecure in their position and aren’t able to easily counter new forms of competition. I think you have to take out of the picture the easiest ways of tamping down or eliminating challenges to your monopoly.

I think that has been a really important thing in U.S. tech since AT&T, since IBM, since Microsoft: keeping the main dominant market players insecure and forcing them to succeed, to improve themselves, as opposed to buying off their competitors or excluding them. So that’s my third.

Klein: So before we wrap here, I want to return to something we’ve sort of been circling, which is: What kind of competition do we want to be encouraging among these platforms?

Tim, one thing you said earlier was that there can be this difference between healthy competition and toxic competition, which, if you read a lot of economic commentary from the early 20th century, you hear a lot about that. I feel like we don’t talk about it that much anymore.

But this is a place where I’ve been skeptical of the argument that many problems would be solved by breaking up the big, particularly attentional, social media and algorithmic media giants. I don’t think Instagram has gotten better under pressure from TikTok. I don’t think that more ferocious innovation and entrepreneurial work to capture my attention or my children’s attention is necessarily good.

Maybe the problem isn’t that we’re not unleashing competition. Maybe the problem is that the entire thing that the companies are trying to do, whether there are two of them or 50 of them, is negative.

Wu: It’s a really good point and a good question. I think in the markets you’re talking about, we have a serious failure to wall off, discourage, ban or ethically consider wrongful the most toxic ways of making money.

So there is such a thing as healthy attentional competition, like making a great movie that keeps the audience enraptured for two hours. Producing a great podcast — that is good attentional competition. And frankly, the attentional market includes all these forms, but we have just allowed the flourishing of negative models.

I think if you had a world in which you had many more limits on what counted and what was, frankly, legal in terms of manipulating your devices, you would see more positive competition if you broke up some of these companies. I just think the entire marketplace of social media is cursed by the fact that we haven’t gotten rid of the most brutal, toxic and damaging business models for our country and for our children and for individuals.

Klein: I think that is a nice place to end. So always our final question: What are three books you’d recommend to the audience? And Tim, why don’t we begin with you?

Wu: Sure. I’d start with E.F. Schumacher’s “Small Is Beautiful: Economics As If People Mattered.” And I say that because it targets this question of what kind of world we want to live in. I think our efficiency obsession is taking us in one direction, and I think we should choose a different direction.

A second book is more recent. Cass Sunstein wrote a book on manipulation that I think is underrated and is really good for understanding what we have allowed to happen. It’s called “Manipulation: What It Is, Why It’s Bad, What to Do About It.”

The last book — I guess this is where I got some ideas about tech platforms and the big picture — is Paul Kennedy’s “The Rise and Fall of the Great Powers.” I feel everything is on a cycle and every empire has its destiny, its golden age, its decline, its stagnation and fall. And I feel like understanding imperial dynamics is very important to understanding the technological empires of our time.

Klein: Cory?

Doctorow: So my first pick is Sarah Wynn-Williams's book, "Careless People." And it's a great example of the Streisand effect: that when a company tries to suppress something, it only draws more interest to it. So Wynn-Williams was a minor diplomat in the New Zealand diplomatic corps. She became quite interested in how Facebook could be a player geopolitically. She started to nudge them to give her a job as an international governmental relations person. No one was very interested, but she just kept at it until she got her dream job. And then the dream turned into a nightmare.

My second choice is a book by Bridget Read. It's called "Little Bosses Everywhere," and it's an argument that the American pyramid scheme is at the center of our current rot. Everywhere you look in the MAGA movement, you find people who have been preyed upon by the kinds of scams that are characteristic of it and who've adopted the kind of toxic positivity that comes with it. It is an incredibly illuminating, beautifully researched book.

And then the final book is a kids' book by my favorite kids' book author ever, this guy called Daniel Pinkwater. Last year he had a book out from Tachyon Publications called "Jules, Penny & the Rooster." Recapping the plot of this book would take 10 minutes because it is so gonzo and weird, but suffice it to say, it revolves around a young woman and a talking prize dog who find a haunted woods nearby, where the young woman is welcomed, in a sort of "Beauty and the Beast" story, as a kind of savior, though she wants no part of it. It's funny, it's madcap, it's full of heart. It is everything great about a kids' book. I read so many Daniel Pinkwater books to my daughter when she was little. They're so fun to read at bedtime. It's a middle-grade book, and I cannot recommend it highly enough. "Jules, Penny & the Rooster" by the incredible Daniel Pinkwater.

Klein: Cory Doctorow and Tim Wu, thank you very much.

Doctorow: Thank you.

Wu: Thanks, Ezra.

You can listen to this conversation by following “The Ezra Klein Show” on the NYTimes app, Apple, Spotify, Amazon Music, YouTube, iHeartRadio or wherever you get your podcasts. View a list of book recommendations from our guests here.

This episode of “The Ezra Klein Show” was produced by Annie Galvin. Fact-checking by Will Peischel. Our senior engineer is Jeff Geld, with additional mixing by Aman Sahota. Our executive producer is Claire Gordon. The show’s production team also includes Marie Cascione, Rollin Hu, Kristin Lin, Emma Kehlbeck, Jack McCordick, Michelle Harris, Marina King and Jan Kobal. Original music by Pat McCusker. Audience strategy by Kristina Samulewski and Shannon Busta. The director of New York Times Opinion Audio is Annie-Rose Strasser. And special thanks to Natasha Scott. Transcript editing by Andrea Gutierrez and Marlaine Glicksman.
