DNYUZ

Galaxy Brain: The Internet Is a Misery Machine

November 14, 2025

Subscribe here: Apple Podcasts | Spotify | YouTube

In this inaugural episode of Galaxy Brain, Charlie Warzel examines the state of the internet as it stands now in November 2025 with Hank Green, a true citizen of the internet—somebody who has made a living riding the algorithmic waves of the social web. Green started his YouTube channel, Vlogbrothers, with his brother, John, back in 2007, and they now have more than 4 million subscribers. Hank is a creator—and not just in the modern sense of the word. He’s an entrepreneur, an educator, a social-media celebrity, and somebody who understands how to build trust and massive audiences online. He’s deeply attuned to the ways that the technological tools we use begin to change us.

In this episode, Warzel and Green look back on a time when the internet felt small, more serendipitous, and inspiring, and try to tease apart what went wrong. Are people starting to leave TikTok? How exactly did the internet turn into a misery machine? What makes a great headline? Why is it easier now for some people to trust creators over institutions? Green helps make sense of the internet we live on and offers his reasons for why it might get worse before it gets better (but it could get better!).

The following is a transcript of the episode:

Hank Green: Like, the thing that really gives me hope is watching teenagers think that what I do is so goddamn cringe, and I’m like: Yes, I’m gonna do it more, so that you think it’s more cringe and you never do what I’ve done with my life. Stay away from this box.

Charlie Warzel: Stay away.

Green: Stay away from the misery square.

[Music]

Warzel: I’m Charlie Warzel, and welcome to Galaxy Brain. Thank you for joining me here on the ground floor of this project. I am thrilled that you are here. This show is nominally about the internet and attention and the ways that all the tools and the media that we use and consume change us in weird and unexpected ways.

And for a long time, I used to describe the internet as this black box, right, that we piped culture and politics and the economy and society into. And what came out at the other end was the same thing, only slightly misshapen and unpredictably weird. But technology has always just been a cheat code for me.

It is a way for me to tell stories and figure things out about the world. The internet is so firmly a part of every aspect of our lives that basically every story is a technology story. All the stories that I love to tell are about us as humans, how we come together, how we’re manipulated, how we talk to each other, and how these tools change the way that we see ourselves and the way we see our neighbors.

I want Galaxy Brain to explore all of this. I want to delight and obsess over news stories that I can’t stop thinking about because they’re ridiculous or they’re weird. I wanna talk to experts, and I wanna take you into my reporting process. I have so many great conversations in my work, and I want them to show up here. My hope is to have them in public.

I wanna learn, and I wanna think out loud with all of you. I wanna make sense of big news stories, but also I just wanna bask in the absurdity of the internet and go down weird rabbit holes. Being online too much can make me feel insane, but what I love about it is that the internet seamlessly blends high and low culture in our feeds.

When something online feels good, it’s because it’s pairing a bit of informational chaos with this feeling of connection and also this sense of knowledge-seeking. I want the show to feel that way, and that’s why I asked Hank Green to be my first guest. Hank is, as one of his social-media bios suggests, a long-time internet guy, but that’s underselling it completely.

Along with his brother John, Hank started one of the earlier successful YouTube channels all the way back before the iPhone, in 2007. He went on to found Complexly, which is an educational-media company. Another of Hank’s bios says, I might have taught you biology, and that isn’t underselling it. It is virtually impossible to list all the things that Hank has done. This summer, he and a friend were messing around with ideas and came up with an app to help people focus. It quickly became the No. 1 free app on the App Store.

There’s a website that you can go to that counts how many days it’s been since Hank started a new project. He is one of the original creators, but he also embodies the creative spirit of the internet. And so I wanted to have Hank here for what I’m calling a State of the Union of the internet in 2025.

This is a look at where things stand. It’s a conversation that’s meant to level-set this podcast going forward. We talked about how the internet has become a misery machine, how institutions have lost trust, how maybe they can begin to win it back. It’s a conversation that touches on this universal frustration of being online and knowing that you’re being manipulated by all these algorithms—and what, if anything, we can do to push back.

Hank is honest, he’s hopeful, he’s funny, and he’s real, which is why he’s the perfect person to kick off this project. Here’s my conversation with Hank Green.

Green: Charlie Warzel, thank you for joining me on the Galaxy Brain podcast.

Warzel: Thank you for doing an introduction for me. Frankly, you know, I’m new at this thing, and you’ve done it once or twice before. So I just, I felt like maybe that would be a good thing. In fact, we may just give you this whole thing if you want it.

Would you like a podcast for The Atlantic?

Green: I’ll ask you some questions. I bet I could come up with something.

Warzel: Good. Good. This is a total panic mode. Minute one. We’re just, like, throwing the whole thing out the window. Really, thank you. Thank you so much. I think you may be sort of the perfect person to kick a lot of this off. Because you are like truly a person of the internet in ways that are, I think, unique.

But I wanted to start really quickly out the gate, and ask you to think back to a time—it can be yesterday, it could be 1998, it can be any time—that you consider, like when you close your eyes, picture a golden age of the internet. Not objectively—your kind of golden age. Your golden moment.

Like, what is it? What pops into your head?

Green: 2012.

Warzel: Why?

Green: Uh, it’s everything until Gamergate was great.

[Laughter.]

Green: I don’t know. There was, like—it was the moment. So I, as a YouTuber, I’m primarily a YouTuber. I think that I’ve moved all around and had, like, places where I’ve had a lot of fun that aren’t YouTube. But ultimately, YouTube has been a very stable place. And that’s where we started as creators.

I mean, I did actually start as an internet creator before YouTube, but YouTube was the place where I had actual success. And there was just this, like, time when there was like, everybody was friends. It wasn’t a very big community. It was big. Like, it felt like it mattered—but to the people who were there, but not to everyone else.

And that’s always the best time on a new platform. Like, when everybody there realizes this is something special. And we’re all here together experiencing this special thing, but nobody outside of the thing knows that it’s special yet. And nobody has ever had the dream of becoming—like, at that point, nobody on YouTube had ever dreamt of becoming a “YouTuber.”

And so all of these people, you know, they had other dreams. But then they were like, What if, like, this is great, though?

Warzel: I always think about around that same time on Twitter, it really felt a lot like—because that was sort of the place where I had that experience. And it felt so much like walking into—like every day, opening up the app on your phone, just felt like walking into a lunchroom, right?

Like, it was contained in that way. It was obviously bigger than that. There were a million lunchrooms everywhere. But my lunchroom was like—these people. It was like, I’m gonna check in on these people. And some of these people are, you know, dicks. And some of these people are, like, my best friends. And some of these people are just the—and you just kind of go to the different tables. And it was like, Okay, you know, I walked through the lunchroom; time to go to whatever class.

Green: Time to actually go to work. Yeah.

Warzel: Time to go to work. Okay, so with that as a little bit of a baseline, in like a sentence or two—like, same sort of thing, closing your eyes—when you feel the internet now, like, when you conceptualize it, what comes to mind? Internet 2025.

Green: Weight. W-E-I-G-H-T. Like, just weight.

It feels so heavy. It feels, it—we have a government run by the discourse, and so it feels like the discourse matters so much. It feels like everything matters a lot. And I don’t know that it does, but I don’t know that it doesn’t. It just feels—it all feels very heavy. It’s very hard. You know, you can go have a fun time in some corners, but it’s always sort of tinged with the weight.

Warzel: I think so. And I think there’s also this feeling, too, that the big part of that, I think, is made by outside forces. Like, life in 2025, and all the things that are happening that are heavy. But I also think that there’s this—the bigness of all of it. Like, the size and scope and scale of the internet right now is also this way where I think so many of the problems that we have boil down to this notion of: Nobody knows what anyone’s experience is like, right?

Like, what anyone else is doing in this place. And I think that that causes just a lot of like agita, right? Like, even in our conception of politics, and this candidate, and what are they thinking? What do working-class people think, and what do, you know, the elites think? And it’s like we’re all shadowboxing this idea of it, because we don’t know what anyone else is consuming.

Green: I think that sometimes people can like be legitimately shocked by, like, a majoritarian viewpoint. Like, they’ll hear somebody say something that’s pretty much right down the middle of the road, average-American stuff. And they’ll be like, Whoa, people believe that still? And it’s like: You’re in a little corner, and you think it’s everybody. There’s so much of that going on. And the other thing that I’ll say—I wanna hit you with this, and I’m curious what you think. I think that as these things start to feel more important, the stuff that succeeds on them, and of course algorithmically, sort of amplified, recommended platforms—which they all are now—they’re all about like, what keeps people engaged. What keeps people watching, or, you know, clicking or responding or whatever.

The things that are sending those signals to the algorithms become more and more the things that, like, feel very important. And so instead of, you know, TikTok being a place where, Wow, that was a cool couple of twins doing a dance, it’s like, I just found out about this thing that the news media isn’t covering that is a very, very big deal. It is going to impact, you know, you or the least powerful in society. Or is in some way morally stained, is some way outrageous or reprehensible. And the thing that gets selected for goes from being, like, a thing that makes you feel good to a thing that makes you feel bad.

And that happened. And that’s like—almost, it seems to be impossible to not have happen on an algorithmically amplified platform. And the more algorithmically amplified it is, the harder it is for that to not happen to it. I think that Instagram actually intentionally puts the brakes on this. I think that YouTube has some structures that make this happen more slowly, but I think they also might intentionally put brakes on it.

I think that Twitter puts the accelerator on it. X accelerates this for us. But I think that it’s very hard to not have this be a thing. And I think that what it selects for is, like, just some guy. So never like a person with any kind of, like, institutional legitimacy. Just some person reporting on something that is, you know, some combination of hidden, outrageous, disgusting. Makes you feel superior to others, while making you like scared or angry.

And, you know, outrage bait is what we generally call this, but I think that, like, a great deal of the internet is sort of made of outrage bait. But as a creator, what I always think is: We make, as creators, we make what the algorithms want. And the algorithms want what the people will watch. And there’s this like—I work with these algorithms.

I know that I have to get views; I have to compete in this environment. And so I definitely, like, I have to fight hard to not do that, even as a person who already has a lot of built-up audience and legitimacy, and like all of that stuff that I have. So I kind of totally understand newer creators who indulge in this, because it’s the only way to get attention.

Warzel: Yeah. I feel like it is so difficult. This happens sometimes, I find, when I’m trying at The Atlantic here to create headlines, right? So you spend all this time going through this process of making something, right? And it’s the research, the reporting, the crafting, the editing, the making everything into what you think is like the most responsible, compelling, interesting unit of information.

And then you realize you have this problem of like, Oh crap, it has to travel. Right? And then, how do I do that? And there’s a thing that I find very frustrating with, like, the term clickbait. Like, I’ve always been sort of against the term. Only because, of course, every single person who makes something that they think is of value wants people to click on it. Yeah. Like, it is not a bad thing to try to do that. But the process I have felt over time, and especially, like, especially given the different platforms and stuff, I feel like the YouTubers almost have an advantage here, right? Because it is something like, you have the auto-play part of the algorithm; you have the recommendation situation built in. But especially on the text-based social networks, it’s like, okay, how—like getting that moment where you come up with the headline.

Green: I’ll give you an exact example of this. So, this Tuesday, my brother posted a video on a YouTube channel that was a—he went to the Philippines to like look at how they are doing tuberculosis screening, to try and figure out how to lower the burden of tuberculosis. Maybe on this one island, eradicate tuberculosis from the island—which is like, it’s a possible thing to do.

And he posted the video, and he had three different titles. All of them underperformed tremendously. And I think the best-performing one was—“We Know How to End TB; They Are Actually Doing It,” was the title. And then, and this video was on track to be our worst-performing video in the last two or three months.

Because, and like, it’s about the most important thing. You know, it’s, like, a really easy way to—and tuberculosis is the most deadly infectious disease in the world. This would be a really easy way to have that not be the case. John texted me, and he was like, “You’re better at titles than me. Help me. It’s fail. Like, nobody’s watching this video.” And you wanna know what the title is now?

Warzel: Please.

Green: “Elon Canceled This. We Are Doing It Anyway.”

Warzel: Yeah, yeah.

Green: It has to—

Warzel: Mr. Beast jumped this train into a canyon and, uh, tuberculosis.

Green: It has to tie into some like narrative that we already have, right? Like, some narrative that we are already caring about. And like, you know, I am, I think, legitimately outraged about the gutting of USAID. And that was, like, one of the most morally reprehensible things I can imagine: for the richest man in the world to cancel aid to the poorest people in the world. And for that to just happen in front of all of our eyes, and we have to keep going on with our lives.

Warzel: Absolutely. And so, that is perfect.

Green: It makes sense. It makes sense that that title would work. But, like, that’s not what I wanna say. What I wanna say is, all that should matter is: We’re trying to eradicate tuberculosis. And let’s just have that be a positive message. And like, that’s the game we’re all always playing, all of the time. But what did I do? I made it about something bad.

Warzel: Right? Right. Yeah. And that just, that affects—

Green: Everything. Everything. I just think it makes us miserable.

I think that, like, what we’re living in right now is the world according to misery, and that you get Trump in that world. Of course you do. Like, if misery is in charge, then you get, like, a conductor of the misery symphony.

Warzel: Right? This was a little bit of—I, I watched this video from Zohran Mamdani’s campaign the other day, sort of like the last-push kind of thing.

And it’s just the thing that I’ve noticed in all those videos. He gets this, you know, a lot of people talking about, like, “He’s so good at the internet,” right? And like, I think that he is. And he’s charismatic and all that stuff. But he’s also just like found this great way to, like, invert that; flip that. Like, it’s all about—what makes it all stand out to me as someone is, like, Oh wow, here’s one guy who sees campaigning and politics as a joyful thing. Yeah. Like it is not—

Green: He seems happy all the time.

Warzel: It is so outside of the misery machine, right? Like, he just walks around New York, like dapping people up. Like, that’s not who he is like; you know, that’s a character in some sense that he’s playing, too. But it’s so different that it’s almost like—it feels almost alien on this internet.

Green: And it gives me a little bit of hope. Like, the thing that really gives me hope is watching teenagers think that what I do is so goddamn cringe, and I’m like: Yes, I’m gonna do it more, so that you think it’s more cringe, and you never do what I’ve done with my life. Stay away from this box.

Warzel: Stay away.

Green: Stay away from the misery square.

Warzel: I love that. I love that. I think that, yeah, we need to—I think there needs to be a complete change, like, with cringe.

Green: Yeah.

Warzel: We have to take back cringe. Cringe has to be—we have to take it back. I think personally.

Green: Well, something’s always going to be cringe. The thing we need to take back is, like—we can’t let trying too hard—like, that’s, of course, always the cringiest thing: appearing to try very hard. But I do think that a lot of young people now are like—you know, it’s, it’s funny that like Jonathan Haidt’s out here being like, We must protect our children from the internet. And the children are like, The internet’s kind of lame. It like happens always exactly at the same time. Like, the kids have their own backlash against it. And they’re like, “Actually, we shouldn’t try to make this look forbidden and cool.”

Warzel: Exactly. No, exactly.

Green: But for clarity, we should keep them out of schools.

Warzel: The phones.

Green: Yes.

Warzel: Or the children. Maybe the children too. Just keep everyone out of school.

Green: Well, you know, we tried that for a year or so, and that was great. No problems at all.

Warzel: Oh, well see, you know, actually. But I am sort of curious in terms of thinking about misery and the internet, and also this idea of like, you know, what it’s doing to us.

You had a video. I don’t know, a couple months ago, I think in the summer. The title of it was, “You’re Not Addicted to Content, You’re Starving for Information.” And like, the genesis of that video, when you’re talking about it, was trying to come up with the right analogy for sort of our information, right?

Or like social-media ecosystem. Correct me if I’m wrong here, but you were grappling with the idea of like, you know, cigarettes is a nice easy frame, but it’s probably not that. And it seemed like in that video you sort of got a little closer to this idea of hyper-processed foods, right?

This, like, “You’re constantly eating, but you’re never satiated” kind of thing. Is that, I mean, is that separate in your mind from this issue of misery, or is that really the same thing that the hyper-processed stuff is like the “misery content”—and what we actually want is something nourishing?

Green: Yeah; I think they’re not the exact same thing, but they’re very related. You know. If I mean a little bit, the sort of hypothetical to ask is what would happen if you, like—so step one: Your reality is created by what you pay attention to. So that’s step one.

And I think that that’s kind of philosophically true. No. 2: What would happen if you gave the smartest people in the world the most powerful tools in the world for controlling people’s attention? And then you optimized that for making them never look away. And that’s very similar to asking the question: What if a big food company was like, “What is the way in which we will take the ingredients of food and make something that people will never stop eating?” And you get a Cool Ranch Dorito, or you get a Big Mac and fries with a Coke. I mean, god bless a Big Mac and fries with a Coke. It’s very good. I understand why I would eat a lot of it. Sometimes people are like, “Hyper-palatable food is so gross.” And I’m like, no, it’s like, in the name. It’s so good. I’m glad that you’ve got your brain in a place where you think it’s gross, but man, I want it now. I want a walking taco right now. Um, which is—I don’t know, do they have those other places? Charlie?

Warzel: Walking taco? No, I don’t think so.

Green: No, that’s, is that like a Mountain West thing?

Warzel: I think it is. I think it is the greatest time zone in the world, by the way.

Green: Okay.

Warzel: Absolutely. Perfect time zone.

Green: Yeah. Oh yeah.

Warzel: Somehow adding one hour makes working with the East Coast—the bully time zone—makes it nearly impossible, somehow, working in the mountain. Yeah. It’s like you’re basically like, you live in New York or D.C. and—

Green: Then you just gotta, you just gotta pretend. Don’t even tell people where you live.

Warzel: It’s awful.

Green: And like, in the same way that you might end up with like, some foods that are very palatable because they are salty and some because they’re very sweet and some because they’re very fatty, I think that you get a variety of kinds of hyper-palatable information. And one of them is the sort of misery-inducing outrage bait.

Warzel: So how do you combat this? Like, in some senses, right? Like, because you are a creator in a traditional sense. But you’re also an educator of millions of people all the time. Yeah.

Green: In a nontraditional sense.

Warzel: But like, you are creating. I was talking to one of our colleagues, and I was like, “I’m gonna talk to Hank. Do you know of him?” And it was basically like, “Yeah, like I was raised by him.” Like, your bio, you know, is like, “I probably taught you biology” or whatnot, right? But like, I think so much about the way that, you know, you say being trained by these algorithms, trying to use the skill for good.

But also there’s that sense that we just talked about, right? With the headlines. It’s like, sometimes you gotta use the dark arts, right? In benevolent ways, and just get people to care. But like, how are you feeling about your ability to balance that? Like, do you feel like you’re, on a day-to-day basis, like you’re winning this? Or do you feel like the forces are slowly kind of crushing?

Green: Like, do you mean personally? Or like in the work, publicly?

Warzel: I mean personally; like in the sense of, do you feel, like personally in your work? I guess so not like, do you feel like those forces are winning? Or do you feel like there are ways like that we can really just keep this at bay?

Green: Um, yeah. No, I think that the forces are winning. But I think that it’s like a long battle. There’s no doubt in, well, if we still exist and everything, there’s no doubt in my mind that there will be a future where we will all look back on, you know, the Twitter era as very similar to the yellow-journalism era.

And like, we will have developed a bunch of new norms and taboos, and certain behaviors will feel very cringe. And certain people will be like, “Oh my God, I can’t believe, like, people listened to the things that strangers said to them on Twitter, and that was how reality was created.”

I think that we’re still in the “It’s getting worse” part, though. There’s lots of signals that, like, I feel like it’s getting worse less quickly.

Warzel: How so?

Green: I think that the fracturing of Twitter, as much as it hurt me. I really, I loved Twitter.

Warzel: Same.

Green: I think that the fracturing of Twitter has been, just sort of like—a lot of people went to Bluesky. A lot of people went to Threads. A lot of people just don’t do it anymore.

I think that, like, that’s kind of the best outcome. And Twitter itself has become much worse. Of course, it’s a very difficult place to go if you’re not a racist. And I think that it’s probably not ineffective at turning people who are mostly not racist to, like, more racist than they used to be.

So, I think it’s a bad place to hang out. And I think that Elon is very aboveboard about the fact that he would like to use both the algorithms of promotion and the sort of effects of large language models to change people’s reality. Like, he doesn’t shy away from saying it. Most of them, I think, do know that they’re doing that, and do kind of like that they’re doing it a little bit, but they don’t say it. But Elon says it. You might think that, with all the sex bots, it would actually make people have less sex. But no: We’re gonna turn the sex dial up, and people are gonna make more babies.

That’s he—he says these things.

Warzel: Right. Also, the, the notion, too, of like creating his own, you know, Wikipedia. Grokipedia, yeah. And the idea of like, it’s—“The facts aren’t really fitting the way that I view the world.”

Green: Yeah.

Warzel: “So we would like to basically take Wikipedia, take the skeleton, strip it, and then inject our reality.”

And it’s like, I do get what you’re saying. There is something minusculely refreshing about like: Okay, well at least it’s like right there. At least we know what the political project is.

Green: And what it feels like is—it’s starting to seem just kind of gross and unproductive. And like, I see it in my physical world, where people are like, “Oh, poor Hank; he has to be on the internet.”

And I’ve seen a lot of people. I’d really like to see TikTok’s numbers, but I feel like people are leaving. I feel like the numbers on TikTok are a lot lower than they used to be. I feel like people have moved more toward Instagram, which is very intentionally kind of more “America’s Funniest Home Videos,” and to longer-form content.

And there’s something I think, really—as a person who also loved TikTok in its heyday and participated very actively there—I think that there’s something very bad about giving away all of your decision making to the algorithm. So like, you never know the choices that you’re making. You know you are choosing to keep watching, and that’s the big signal that it gets.

But you don’t know you’re making that choice. Whereas on YouTube, you go, and you, like, pick out which videos you wanna watch, based on who you like, what thumbnails are appealing, what titles seem interesting.

Warzel: It really does feel, in that way, so much more like going to like the video store, right?

Like yeah, going back to ’90s or whatever, right? Like, you do that thing. I pull it up; like it blows my mind too: how, more than any other platform, I think the more I put into YouTube, the better my experience feels. You know, I mean, not always a hundred percent of the time. But it’s like, I open that thing up now, and I feel a little bit more like, okay: This thing kind of knows me, and it knows here’s my guilty-pleasure section; here’s my, you know, like I’m being a dude watching sports stuff. Here’s my podcasts. You know, all that kind of stuff. Whereas like with TikTok, it feels, even when it’s good, like being waterboarded, right?

Green: It’s just, yeah. I mean, it’s there. The hardest time on TikTok for me was when I was going through cancer treatment. And I would get, like … I had to stop using TikTok. Because I would never know when I was gonna get hit with illness content. Because, of course, I had started to selectively watch that kind of content longer.

And I’d just be scrolling, and then it’d be like a person in the hospital dying of cancer. And I’m like, Well, this app is no longer useful to me, because of course I’m gonna watch that. But like, I shouldn’t be—like, that’s not helping me.

Warzel: This also, this makes me think of, too, the way that what you just said about, you know, younger people sort of being like, “Yo, internet, uh.” You know, we’ve had a lot of that. I think some of this, too, is just like the only sort of truism I think about the internet and, like, younger generations who’ve just grown up totally immersed in it—is they just have such an innate understanding of the way that they’re being manipulated by these things.

And I think, more so than anyone else, they’re a little bit like—as soon as they can feel that, like heavily, there’s this real, “Oh, come on man.” Like “You know, like, I’m checking out for a moment.” Not forever, obviously.

Green: But I think there has to be a real pushback against manipulation, which is a lot of what the algorithms are about, and a lot of what the algorithms inspire creators to do.

You know, on TikTok, it’s really important that you keep people watching for the first five seconds. And so you wanna have them kind of be confused for the first five seconds, so that they’re like, “Wait, I don’t know.” Because if they think they know what’s coming, then they’ll swipe away.

They’ll be like, “Oh, I know what’s coming.” That, you know, reward has already occurred now. And like, but eventually that starts to feel really manipulative, because people get what you’re doing. You’re intentionally keeping them confused. And I think that, like, we need to have a backlash against manipulation.

I also think that we have to have a backlash against—I think we hopefully will have some kind of backlash against outrage, which is like the ultimate form of manipulation. And like, it can be used for good. I’m not saying it can’t be, and it has been. But as we record this, the government is still shut down.

And I think it is in part still shut down because what’s the incentive to open it back up? What’s the incentive for anyone to capitulate, when everybody hates everybody and working together is the thing that loses you the primary? You know, like it’s the thing that’s selected against. And so we need some kind of like awakening.

That’s sort of: “Okay. So it turns out when government doesn’t work at all, that is bad.” It’s bad for everybody. But it’s also bad for the politicians, and so, what are the mechanisms at work here? And I think one of the mechanisms is that there’s, like, four guys in charge of defining everybody’s reality, and the only thing three of them are optimizing for is profit.

And then the last one is optimizing for “Believe the same things Elon Musk believes.” Which—his entire worldview was defined by algorithms optimized for profit. So he is like the meta-profit. The meta-algorithm has happened inside of Elon Musk’s brain.

Warzel: Yeah, no, I mean—that’s such a great way to describe this sort of period where it feels like we are. Like a lot of the people in Silicon Valley have been radicalized by their own products, and now they’re exerting just an extreme amount of authority.

But what they are parroting is the reality built by the algorithms, like the ideas. It is this very odd thing. It’s like, finally, like everyone’s getting radicalized or changed by these platforms in weird ways—but it’s like it ended up happening not only to the people who built the project or the tool. But also like the people—the tool made them so much money they can take that money, turn it into influence, turn the influence into policy, or reality, or whatever. And it’s like, oh my lord, like this cycle just built itself into, yeah, into reality. I think that’s a very good way to be thinking about it. And also very sad way.

Green: A very, yeah. No, I don’t like it.

I hadn’t said it that way before.

Warzel: Well, that’s what podcasts are about. Okay. Yeah. Figuring out what it is you have to say; it’s like therapy that way. So I wanna talk though a little bit about, with all of this in this maw, and as someone who is an effective communicator and educator, as we just talked about, an effective creator, someone with lots of audience in lots of different places.

After the election, I had a phone call with you in which I was like, trying to make sense of some of the stuff, especially in, like, the media ecosystem. And this idea, this long, tortured conversation on my end, about trust, right? And this idea that institutions have lost tons of trust, and in some ways the parts of the institutions that are meant to build them, right?

Like fact-checking, editing, all these things, are actually weaponized often against them. And it’s this notion that, like, well, “Why did they edit that? What are they hiding?” Even though it’s just like a quality-control mechanism. Meanwhile, creators—many of whom are doing great work, some who are, you know, doing less great work—but mostly like working very independently, sometimes flying by the seat of their pants.

Again, this is not to demean anything in that space. But just people who are sort of more on their own, not part of these institutions, yeah, have built up so much of this trust. Like they have kind of collected the rest of that trust. They have it. And it’s this very odd—understandable in many ways, but—this very odd balance.

And so you’re somebody who, I mean, you have built up tons of trust with audiences. And I’m curious: What do you think about this interesting inversion of trust in the world as it is right now?

Green: I think it’s bad. On the whole, I think that there’s lots of reasons why groups of people working together are better than people working alone.

I say this as a person who’s done it both ways. But I also know that there’s—at the moment, people really like it when it’s a person doing it on their own. And I have thought a lot about what it is that has made that happen and what’s driving that. And also I think it’s important to recognize that people are also this way with LLMs. So it’s not just creators. You also can go on to ChatGPT, and you can ask it a question. And it’s easier to trust ChatGPT than it is to trust a news-media organization, which is also very interesting. I think that it is a, not so much—I had a fight with Nilay Patel about this, actually, I’m remembering.

Warzel: All right—let’s go.

Green: Um, he—

Warzel: Airs dirty laundry.

Green: Well, Nilay is. This was public, so don’t think that it’s—it’s not a secret fight. And it’s a productive fight. We are very good pals. His take was, the platforms want you to believe this, but it’s not actually true. And I think that’s wrong.

Warzel: The platforms want you to believe which thing?

Green: That individuals are more popular and more trustworthy than—that people trust individuals more than they trust institutions now. And in fact, the platforms are just like, “Yes, yes, please believe that.” Because it’s much easier to extract value out of individuals than out of institutions.

Institutions will argue with you about what you could and like—whereas I’m like, Are you gonna gimme attention? Come on, give it, I’ll do anything. I’ll do anything for a little, just a little bit more. And especially like, you know, there’s always somebody to come and replace whoever burns out.

You know, there’s always somebody else who will do it for not very much money or just for the clout. So I think that it is part of the misery machine. This is my take. It is part of the misery machine, and you have to ask if you are going to make content that makes people outraged.

Who is the villain? Who are good villains? And news media is good villains, and I see all the time, people will say, “The news isn’t covering this story.” So there was that flood in Alaska, and I saw a bunch of TikToks about how the news media wasn’t covering the flood in Alaska.

And they were from people who had read news stories about it. And the reality is that nobody was paying attention to the floods in Alaska. But if you lead in with “The news media isn’t covering this,” then people do start to pay attention. So they weren’t—like, it wasn’t that big of a deal to them.

It didn’t really matter to them that there were a bunch of Indigenous communities in Alaska that had been wiped out by these floods. But then it kind of did matter when it turned out there was a big power structure at play. And that was fitting into a narrative of big, powerful institutions ignoring marginalized people. Or, you know, this sort of Trump “fake news” stuff.

Big, powerful institutions doing these un-American, dangerous things. And so you can come at it from either direction, and the news media is always a good target. And it’s always a good path into getting views and, like, amping up the stakes, amping up the outrage. Making it seem like it’s somebody’s fault. And it’s not your fault.

So whose fault is it? “It’s the fault of the person who wrote the publication where I read the story.” Um, like, but that doesn’t count, because like, you know. How does, who determines where the story goes on the front page? Ultimately, you gotta understand that’s determined by what people are interested in. And the newspaper will do everything in its power to have the most interesting story be the one at the top, and like they’re trying to suss that out all the time.

That’s the whole goddamn point of the editorial team—to try and figure out which stories are interesting enough to cover. But anyway, regardless, that’s just one sort of specific example. I think that the news media is just a particularly good target. I think that institutions are good targets, because it seems like they should be perfect. At any time, you can find a way that they’re imperfect, or tell a story about how they are, even if they didn’t make a mistake. Which is also fine. Then that’s just another sort of, like, puzzle piece in the misery puzzle. I think institutions are just easy to attack. And that goes for the CDC, it goes for academia, it goes for the news media, it goes for the federal government. It goes for everything.

Warzel: It’s very—there’s so much there. I think so much, too, about, like, I mean, there is a frustration when you’re in the media. Obviously with someone’s, you know, “They’re not covering this,” and so much so. Though that like, you can kind of push that aside.

But I think a big, like a really big, thing that people don’t talk about enough is this notion that—and people get very mad when you say this as someone in the media—but that audiences, like you said, are in control in ways that they don’t understand. And it’s not that publications don’t have their own, like, ideas about, like, “Hey, this is what we’re gonna put, you know, front and center.” But like, there is this feeling all the time. And you see it a lot with the way that people, like when people talk about the news they want covered, right, or the things that they want, versus the things they click on. It’s very different, right?

Like, people just want—we all have our base understanding of our base attention structures in our heads. And, like, we click on the things that are interesting. That, you know, that alarm us, that shock us, that surprise us, that confirm our beliefs. All those things. And often those stories, those things—you know, climate change is an awesome example of this.

Like, it’s hard to get people to care about climate stories in certain ways, despite the fact that everyone cares about the planet, right? And I think it’s very interesting. So in that way, only in the way that people, again—this is like going back to headlines, right? That people want other people to care about what they’re doing.

There’s this way in which so many people in the audience are in control in ways that they don’t understand. Like it is not, I think, a coincidence that around, you know, this birth of the social internet in like 2009, 2010, all newsrooms got these metrics like Chartbeat and the things where you can see what people are doing.

I don’t think it’s—like, trust in media, you know, declines relatively steadily. But I think that it’s not a coincidence that once we realized exactly what people are seeing and thinking and clicking on and everything, that trust, you know, didn’t go up. And I think that’s a little because—

Green: Well, do you think that it’s not a coincidence that that trust went down? Do you think that, like, as people were given more of what they didn’t actually want, but were more likely to click on, that that degraded trust some? ’Cause that’s not really how I see this, but—

Warzel: I have that—

Green: Piece in.

Warzel: I have this theory. I don’t think it’s neat, right? But I have this one theory that at least what’s happening now is that there is an audience fatigue of being given what you want. Like, in this sense, I think a lot, you know, there’s this sort of apocryphal quote from, like, Rick Rubin. But all the creative people say this thing, which is basically just, you know, it’s like: What people want is, like, your taste, right?

Like, the thing that they want is for you to not crowdsource and poll-test. What is the creative thing? They want the Red Hot Chili Peppers or whatever to be like, “This is the sound,” right? Like, no, this is the thing. “This is who we are, this is our taste, and hopefully you like it.” And people respond to that sort of like, “I made this for you. Take it or leave it.” Not “You made this for you.”

Green: This is definitely advice I give to creators, which is like: If you only chase the thing that’s getting the most views, people will get tired of you. So you have to be making something—you have to be making some stuff that’s like, “This is gonna get a bunch of views, and I know it,” and some stuff that’s like, “This is something that I’m interested in, but I think it’s going to maybe do well.” And some stuff that’s just like, “I just wanted you guys to know that I’m a person, right? And so here’s, here’s like some person stuff.” You know?

Warzel: I’ve been thinking a lot, and I know you have too, about AI slop, right? And you’ve spent some time with the Sora 2 feed, right?

Green: I did. I went on there for the worst 30 minutes of my life. Yes.

Warzel: And this sort of deadening, like human-less but human-generated, but not human-made, content thing. Of this sort of, you know, styrofoam packing peanuts of whatever that’s also burning down the Amazon rainforest or whatever.

And I’ve gotten relatively despondent at times about AI slop, right? Just, like, the fact that people just keep doing this and making it, and at the same time everyone kind of thinks it sucks, but everyone is still sort of like, Yeah, whatever.

Like, you know, Shove it down my throat. It’s all good.

Green: It’s very hard to know how many people think it sucks, because like the people who don’t think it, who are like, “Ah, it’s fine,” don’t say anything. And also they might just be like normal people, you know? I always go on Netflix, and I look at the No. 1 show of the week—oh boy.

And I’m like, Well, I have no idea what’s going on. I don’t know what America is, and I don’t know who these people are, ’cause I’ve never heard of that show.

Warzel: Right.

Green: So that’s one thing to always keep in mind. But I think mostly—I don’t know. I think it’s a fad.

Warzel: Yeah. But I also—what it makes me think of, though, is this idea, right? That let’s say “slop” is not just the synthetic photos or whatever. But it’s also companies that get, you know, ChatGPT to write an article or SEO copy. And, you know, create websites that’ll help go up their rank.

But it’s all synthetic stuff. There was a recent study—again, take it or leave it—from an SEO company called Graphite. That was just saying that, like, we’ve passed the whatever event horizon of, like, more than 50 percent of this stuff is AI. Is synthetic in some way. On the internet.

And basically what I’m thinking about with this idea is, like, the notion that maybe it’s almost like a controlled burn of like the internet, right? Like in order for these other things to grow, we’re going. We’re so inundated with this stuff.

Green: The Great Chicago Fire of 2025, except it’s the whole internet.

Warzel: Exactly.

Green: Yeah.

Warzel: And this notion that, you know, eventually there could be this—like maybe, in this hopeful way, there could be this premium on like, “Man, there’s just so much garbage, right. That like, I do want—I want the opposite of that. Whatever the opposite of that is.” And the opposite of that is much more human made.

You know, whatever stuff. Do you think of it that way, or how do you think about that?

Green: I don’t know. I think that I was so wrong about how the internet went the first time that I don’t trust myself with any—

Warzel: How were you wrong about it?

Green: Well, I thought it was gonna bring us together, Charlie.

Warzel: Oh.

Green: I thought—

Warzel: Joke’s on you.

Green: I thought, We’re all gonna have our voice. We’re all gonna be able to listen, and we’ll all listen to each other. Everybody thought that. It was Creative Commons and Wikipedia. Look at those great things that were from the early era of the internet.

Like, Wikipedia remains one of the greatest creations of humanity. And that, like, that came together very quickly in the early internet. And it’s still, I mean, it’s still very good. But like now, we have all this. Now we have all this. Now we have the discourse, and the discourse is in charge.

So, I don’t know. What I know is that the way that it is right now is not the way that it’s going to be. I know that people, so far, every time you’ve given them a chance to give away choice and agency in how they spend their time, have wanted to take that. Have wanted to take that product. The one where they make fewer choices; the one where they are served.

And that’s the thing I hope that we have the backlash against. Um, I do wanna point out that I think that the charts showing, you know, how much of the internet is artificially generated and how much is created by humans are one way of showing that data.

And if you showed that data as instead of content on the internet indexed by search engines, you showed it as content consumed by people on the internet, that the chart would look very different. And I don’t know how to get that data.

I don’t know what that would look like, but I’m fairly confident that if you looked at what people actually are reading, what they’re looking at, that, you know, still 90-plus percent of it would be created by humans. But I don’t know. Maybe not. I do know that those are two very different numbers, though.

Warzel: Yeah. Well I think that’s a hopeful place to end it, here. I think we both need the revolution.

Green: Yeah.

Warzel: In a sense.

Green: I don’t know what it looks like—

Warzel: It might, it might not be as bad as—

Green: It might look more like outside. Honestly. I think like maybe that’s a revolution at this point.

Maybe the internet will get better, but maybe we will just be on the internet less. I think that that could also be quite good.

Warzel: I think that is actually—I think that’s the direction I wanna push everyone with. Yeah. In all the further episodes, I want you to end the episode by either realizing that you are outside and throwing your phone into the ocean, or throwing your phone into the ocean and going outside.

Green: Is this the first episode of this podcast?

Warzel: It could be. You never know.

Green: Well, I was gonna feel honored, and now I feel like I might be honored.

Warzel: No, no. It is the first episode. You can feel honored.

Green: Oh, well.

Warzel: Now there is one last very small question I have, and it’s, I won’t ask other people this, probably because they’re not gonna be good at it.

Green: What three books are you reading right now?

Warzel: No, not what are three books? My last question for you is: You need to come up with the YouTube title for this episode.

Green: Ooh. Oh, great question. Yeah. That is my job. That is the thing that I do.

Warzel: I know.

Green: Um, okay. “It’s Gonna Get Worse Before It Gets Better.” That might be a good one.

Warzel: Hank, thank you so much for coming on Galaxy Brain.

That’s it for us here. Thank you again to my guest, Hank Green. If you liked what you saw here, new episodes of Galaxy Brain drop every Friday. You can subscribe to The Atlantic’s YouTube channel, or on Apple or Spotify or wherever you get your podcasts. And if you enjoyed this, remember you can support our work and the work of all the journalists at The Atlantic by subscribing to the publication at TheAtlantic.com/listener.

That’s TheAtlantic.com/listener. Thanks so much. See you on the internet.

The post Galaxy Brain: The Internet Is a Misery Machine appeared first on The Atlantic.
