Cultural attitudes toward porn may be liberalizing, but the belief that minors shouldn’t have unfettered access to it remains broadly shared. Parents are the natural guardians of their children’s internet habits, but many report feeling powerless against the innumerable work-arounds and relentless societal pull toward unrestricted internet use.
So what can be done to prevent kids from accessing harmful content? Make porn websites check ID? That’s exactly what several states have tried—with mixed results.
A new study by researchers at Stanford, NYU, the University of Georgia, and Georgia State followed the implementation of a law in Louisiana that required any website publishing a substantial amount of pornographic content to take reasonable steps to verify the age of users before giving them access. The researchers found that while search traffic to Pornhub—which complied with the law—dropped by 51 percent, traffic to its noncompliant rival, XVideos, rose by 48.1 percent. This is a classic tale of tech regulation: lots of friction while the primary aim remains unfulfilled.
But one of the researchers, Zeve Sanderson, the executive director of NYU’s Center for Social Media and Politics, isn’t resigned to defeat. On today’s episode of Good on Paper, we discuss what governments can even do to regulate the internet on behalf of minors and what doing so might cost the rest of us. Also, he explains, Louisiana’s legislation shows that writing a law can be the beginning, not the end, of a policy process.
“A noncompliant firm that platforms content that we would be more concerned about has risen,” Sanderson laments. “And it’s not clear to me that any laws are going to change as a result.”
The following is a transcript of the episode:
Jerusalem Demsas: Thirty years ago, one of the only legal ways to access porn was to walk into a store, show some ID, and purchase a magazine or video. Today, the concept is almost laughable. I don’t think most minors even realize they’re doing something illegal when they search for porn online. When something is trivially easy—like jaywalking or setting off fireworks or finding porn on the internet—it feels legal.
But over the past three years, legislators in nearly half of U.S. states have passed laws to try to end the porn free-for-all. The goal, they say, is to stop kids from viewing adult content, by forcing porn sites to verify the ages of their users. This episode is about how policy can backfire, and raises questions about how governments can even begin regulating what kids do on the internet.
My name’s Jerusalem Demsas. I’m a staff writer at The Atlantic, and this is Good on Paper, a policy show that questions what we really know about popular narratives.
My guest today is Zeve Sanderson. He’s the executive director of the NYU Center for Social Media and Politics, and a research associate at the school’s Center on Technology Policy. In a new study, Zeve and his co-authors find that the effects of these laws are not as policy makers intended. While there was a 51 percent reduction in searches for Pornhub, which complied, there was a nearly commensurate increase in searches for the dominant noncompliant platform, XVideos.
We wanted to give XVideos an opportunity to respond to this story and the claims that they are not complying with U.S. state laws. We requested a comment from the company but have yet to hear back from them.
Let’s dive in. Zeve, welcome to the show.
Zeve Sanderson: Thanks so much for having me.
Demsas: So we’re going to start with a noncontroversial question. Do you think porn is bad for kids?
Sanderson: So it’s a good question. One of the challenges with answering this question and answering many of the questions right now around, sort of, “insert a particular type of digital media,” including social media, and its effect on kids, is that it’s really hard to run, sort of, causal studies on kids.
Demsas: You’re not going to randomly assign kids to watch porn.
Sanderson: Exactly. And which is actually sort of a big issue when it comes to, like, Is porn bad for kids?
Demsas: Yeah.
Sanderson: And so there’s a fair amount of correlational work. I want to flag that my focus is on political communications and on tech policy. So I obviously have a broad understanding of the literature, but I myself am not a psychologist.
As I understand the literature, studies have shown that adolescent exposure to online pornography is associated with things that we would consider to be normatively bad—things like body-image concerns, lower self-esteem, and increased acceptance of sort of aggressive sexual scripts, which may normalize sexual aggression more broadly in intimate contexts.
However, again, these are purely correlational, and so drawing that sort of causal connection is hard, in general, but especially hard, like you said, when we’re not going to run, like, an RCT where we expose kids to pornography.
Demsas: Hey, kids. Do you want to join this fun study? (Laughs.)
Sanderson: Yeah. I can’t imagine any university ethics board that would go along with that.
Demsas: Yeah, that’s like a 1960s-type study. And if they didn’t do it then, we’re not going to get it now.
Sanderson: Right.
Demsas: Okay, but what do you think? Like, you’ve spent a lot of time in this space, right? I get that it’s correlational, but the problem with this kind of research is that we have to make policy even though we don’t have RCTs. So do you think it’s more plausible that porn is making kids less happy, have body-image issues, express more anxiety, have negative interactions or things that we would consider bad social scripts regarding gender relations?
Or do you think it’s mostly, if not all, selection—that the kids who already kind of have those traits are the ones more likely to be using porn or admit to using porn in surveys?
Sanderson: So I think that it broadly depends on contextual factors, right? So kids with access to comprehensive sex education and strong parental communication sort of demonstrate in research a greater ability to critically evaluate pornographic content.
That said, I think one of the interesting things about this policy space is that there’s actually pretty broad acceptance that, like, trying to block access of kids watching porn is a good thing that everyone wants to move towards. Even Pornhub and the Free Speech Coalition, which is the sort of professional lobbying group of the adult-entertainment industry, are all sort of directionally on board with “kids shouldn’t have access to adult content.”
As I’m sure we’ll discuss on this show, though, the challenge becomes: How do you move from there to, like, a set of policy interventions that actually work?
Demsas: Set the stage for us a bit here about porn usage among kids. And to the best of our knowledge, how much usage do we see? How much do we see it differ by gender? What’s the age of first exposure, as best as we can tell?
Sanderson: Yeah. So in terms of what we know, a lot of it, like we discussed, starts from survey-based studies because we’re not going to be treating kids with certain types of adult content. And so surveys suggest that first exposure typically occurs between the ages of 11 and 13.
And so I think one of the interesting pieces is that we have always attempted to restrict kids’ access to porn. And before the internet, that largely was done via needing to show an ID at a convenience store or an adult shop in order to get access to the products that they were selling. The internet really complicated things, and policy makers in the ’90s largely tried to solve this and failed.
The main laws that were passed were struck down on First Amendment grounds. And so as a result, what we’ve sort of gotten in this shift from primarily offline access to primarily online access has largely been self-regulation. So if you access an adult website, they have a pop-up that asks you to verify if you’re over 18. In one early example, there was an adult video game called Leisure Suit Larry in the ’80s that would ask trivia questions as an age-verification system.
Demsas: Wait. Like what?
Sanderson: Like “Who is Spiro Agnew?” Nixon’s former VP.
Demsas: (Laughs.) That maybe also shuts out a bunch of adults too, right?
Sanderson: Totally. Yeah, it would be a weird cross section of, like, very precocious kids and adults who have access to it. Another funny one is that one of the questions was, “O. J. Simpson is ______,” and it was a multiple choice, and it was 1987, and one of the wrong answers was “under indictment.” And so there would also be a time feature to who had access to this.
Demsas: That’s so funny. I don’t know if this is an apocryphal story or a real one, but there’s the park ranger trying to design a trash can. And he’s, like, the overlap between the smartest bear and the dumbest human is quite large, so designing trash cans in national parks is difficult.
Sanderson: Exactly.
Demsas: Cool. Okay. I think that sets a stage for us a bit here because we’ve seen in recent years kind of more attention towards how regulators can really engage in this space. The internet’s kind of like the Wild West, and it’s a place where you don’t see a ton of regulation, not because I think there’s not a desire to do so, but people kind of feel like it’s futile, which is maybe a theme of this podcast today.
But my colleague Marc Novicoff wrote a great article in The Atlantic that goes over some kind of personal history here in Louisiana. So Louisiana passes a law to force pornography websites or websites containing, quote, “substantial adult content” to verify their users’ ages.
And Marc writes that it happened, in part, because the Louisiana Republican state representative Laurie Schlegel decided to act. Schlegel is a sex-and-porn-addiction counselor and had heard Billie Eilish describe how porn had affected her as a child. “I started watching porn when I was, like, 11,” Eilish said on The Howard Stern Show. “I think it really destroyed my brain, and I feel incredibly devastated that I was exposed to so much porn.”
Obviously, Billie Eilish is not solely responsible for this trend, but I think that those kinds of accounts have become more common as the internet generation has grown up; they are now adults reflecting back on their own experiences, and you have some people kind of having the same experiences as Billie. So can you walk us through Louisiana’s law? What did Act 440 do?
Sanderson: Yeah, so essentially what Act 440 did: It was implemented on January 1, 2023. And there were a few key features of what it was doing. So the first is that it set specific technical requirements for verification providers. So these are the providers that essentially sit between a website that hosts adult content and a user, in order to verify the user’s age. The second is that it clearly defined covered content and websites. And it also introduced substantial penalties for noncompliance.
Demsas: And what were those penalties?
Sanderson: So the penalties were not to exceed $5,000 for each day of violation and not to exceed $10,000 for failure to perform reasonable age verification. One of the challenges in actually implementing this, though, is that people in Louisiana, like everywhere in the world, have access to websites all over the world.
And so if there is a website whose servers are in another country, and let’s say it’s owned by a company in another country, and they have no sort of U.S. legal presence, being able to actually levy those penalties against that company is pretty much impossible. And thus those companies don’t have to comply.
Demsas: In the paper, your goal is to see how this law and other laws in other states played out—21 states have passed similar laws. Are they all kind of in the same form as Act 440, or is there a lot of variation?
Sanderson: So 21 states have passed age verification. In 18 states, the laws are in effect, and in three, they’re going into effect this summer. Interestingly, 17 other states and D.C. are also considering age-verification bills. So the question then is: How similar are they? In short, they’re relatively similar. They’re all based off of sort of a similar model for the policy.
Where they really differ, though, is the technical requirements or the mechanism for age verification. And as a result, you actually see Louisiana be a little bit of an outlier relative to the other 17 states where the laws are in effect, because Louisiana has a digital-ID program called LA Wallet, and part of the sort of age-verification mechanism in Louisiana specifically is able to leverage LA Wallet in order to give users access to adult content in a privacy-preserving way. Whereas in other states, they had different age-verification mechanisms, including uploading a copy of a government-issued ID, like a driver’s license, relying on a third-party vendor to verify a user’s age using various data. And all of these were relatively privacy invasive. And so as a result of these other laws, Pornhub, which is the most popular adult website in the U.S., pulled out of all of those states. The only state where it’s still active in which an age-verification law has passed is Louisiana.
Demsas: Wow. And the reason for that is because it was concerned about privacy?
Sanderson: It’s concerned about a bunch of different things, all of which are extremely valid. So one is privacy. Does the user have to turn over any personally identifiable information to a service, and in particular to the website that’s doing the verifying, like Pornhub, that at some point could be used to reidentify that user? That’s one of the main concerns. And the reidentification could obviously happen in certain ways. It could be everything from a hack, and so a ton of users’ history of actually watching particular adult content is made visible.
But also, there are other legal mechanisms by which somebody could access it, like potentially a subpoena. And so there’s this big question, which was how was age verification being done? Who was doing it? And whether users’ privacy was protected. And at least Pornhub’s perspective was that Louisiana was the only state where they felt comfortable complying with the law versus just pulling out entirely.
Demsas: Part of the reason why I wanted to talk to you is because you had a preanalysis plan and preregistration of your study. For folks who don’t know what that is, can you explain why that’s important?
Sanderson: Essentially, a preanalysis plan specifies the way that we are going to analyze data before we see those data. And that’s really important because it gets around some of the issues that I know you’ve been interested in—and your colleague Derek Thompson has been interested in—around the ability to do really good, open, transparent science that we can trust. And this is one way of doing it. It’s sort of calling your shot, almost Babe Ruth–style, you know, pointing over the fence.
And it keeps us from doing some things afterwards that researchers have been shown, at times, to do: essentially using statistical methods in order to find an effect that often doesn’t replicate in the future, because it really wasn’t as robust or rigorous as we wanted. And preregistration is one of the tools that we have to guard against that.
Demsas: And so what did you expect the impact would be then? Because you also preregistered, sort of locking in, your hypothesis ahead of time.
Sanderson: Yeah, we largely expected what, in fact, has happened, which was that there were, aligning with the theme of this show, all sorts of unintended consequences that maybe took this policy that was good on paper and, at best, complicated it and, at worst, you know, has suggested that it’s ineffective or potentially even harmful.
Demsas: So walk us through those. What were the main findings of your paper?
Sanderson: Yeah. So we had three primary questions. The first is: Did compliant websites see lower search volume as a result of the laws? The second was: Did noncompliant websites see higher search volume as a result of the laws? And the final was: Did people search for VPNs, which would help them circumvent these laws?
And I should mention that we use Google Trends data for a few different reasons. The first is that it’s granular, and it’s free, and it’s accessible. And so what that allowed us to do was actually drill down with some temporal granularity to see the way that search volume around these topics—in our case, Pornhub, which was the compliant firm, XVideos, which is the most popular noncompliant website in the U.S., and then searches for VPNs—we were interested to see how those shifted over time.
Obviously, Google search results are imperfect. We would prefer to have access to the actual sort of data of who was visiting these websites. However, that’s not data that’s freely available. It actually costs hundreds of thousands of dollars. And so instead, we use Google search results. But what we do is we look at the correlation between Google Trends data and SimilarWeb data, which actually looks directly at traffic at the national level, and we show very high levels of correlation. And so we expect that what we’re seeing in our results would actually sort of directionally align with real, actual visits to those sites.
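For readers who want to see the shape of that validation step, here is a minimal Python sketch. The file names, column names, and the use of a simple Pearson correlation are illustrative assumptions; the paper’s actual pipeline isn’t described in this transcript.

```python
# Sketch of the validation step described above: check that normalized
# Google Trends values move together with an independent estimate of
# site traffic. File names and column names are hypothetical.
import pandas as pd

# Weekly normalized search interest (0-100) for one term, one region.
trends = pd.read_csv("trends_pornhub.csv", parse_dates=["week"])

# Weekly visit estimates from a traffic panel (a SimilarWeb-style source).
visits = pd.read_csv("visits_pornhub.csv", parse_dates=["week"])

merged = trends.merge(visits, on="week")

# A high correlation supports using search interest as a proxy for visits.
r = merged["search_interest"].corr(merged["weekly_visits"])
print(f"Correlation between search interest and visits: r = {r:.2f}")
```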
Demsas: Give us a sense of the magnitudes here. How much did you see search traffic decline towards the compliant websites?
Sanderson: Yeah, so for the compliant website—again, we focus specifically on Pornhub because it’s the most popular adult-content website in the country—we see over the three months after implementation, search volume drops 51 percent.
Demsas: That’s a lot.
Sanderson: Yeah. It’s a lot. And I think one of the important things to emphasize about Google Trends data is that it’s all relative. So we actually don’t know exactly how many searches someone did. Instead, it’s normalized on this sort of zero to 100 scale, where 100 is the peak search interest in the given region in the given time. So in this case, it would be the states that we were focusing on in the time period of the study.
So we also think about this in a slightly different way that might be more meaningful, which is that Pornhub lost about 4.4 weeks of peak search traffic over those three months. Similarly, or rather conversely, we saw XVideos, which didn’t comply with the laws, see a dramatic increase in search volume.
So over the three months after state implementation, we saw searches increase 48.1 percent—which, you know, similar to the previous statistic, would sort of account for roughly a 3.62-week gain of their peak traffic during that period.
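To make the “weeks of peak search traffic” framing concrete, here is one plausible construction in Python. Google Trends normalizes each series so its peak week equals 100, so a cumulative shortfall of 100 points equals one full peak week. The weekly values below are invented, tuned only to roughly mirror the ~51 percent drop described above; this is my reading of the statistic, not code from the study.

```python
# One plausible construction of "weeks of peak search traffic" lost:
# Trends values are normalized so the peak week = 100, so a cumulative
# shortfall of 100 points equals one full peak week of searches.
# Weekly values are invented for illustration.

pre_law = [60, 62, 70, 100, 65, 63]          # weekly interest before the law
post_law = [35, 33, 36, 34, 32, 35, 34,
            33, 36, 34, 33, 35, 34]          # ~3 months after implementation

baseline = sum(pre_law) / len(pre_law)       # counterfactual weekly level

pct_drop = 100 * (1 - (sum(post_law) / len(post_law)) / baseline)

# Shortfall relative to baseline, in units of the peak week (= 100 points).
lost_peak_weeks = sum(baseline - v for v in post_law) / 100

print(f"Average drop vs. baseline: {pct_drop:.1f}%")        # ~51%
print(f"Peak-week equivalents lost: {lost_peak_weeks:.2f}")  # ~4.7
```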
Demsas: So it’s, like, almost offsetting the decline in Pornhub traffic.
Sanderson: Yeah. So, I mean, because Pornhub started at a higher level, it doesn’t fully offset it. But it does certainly offset some of it.
Demsas: I hear what you’re saying. Yeah.
Sanderson: One of the other interesting things, though, is: You can think about this law as attempting to do many things, right? The main thing that it’s attempting to do is protect kids from having access to adult content. But there’s also this economic effect, which is that these are really large websites that make a lot of money. And what you’ve effectively done via these laws is you have benefited a firm that was noncompliant, because it was noncompliant—which creates these really perverse incentives in this sort of regulatory environment where noncompliance allows you to gain market share from your main competitor that complies with the laws.
Demsas: But I guess on the first question, it seems like you probably did see fewer people watching porn online, given the information that you had. I would expect there’s a decline, even if it is offset by increases in traffic to noncompliant sites.
Sanderson: Yeah, so I largely agree that what I would assume is that there probably was some drop in overall porn consumption in these states. Again, it’s tough for us to tell, because we’re using Google search data. And one person, when we were presenting this paper, asked, Who searches for porn? Like, Why is this actually good data to use?
Demsas: That’s so funny. Any question you ask is a confession in this space. (Laughs.)
Sanderson: Totally. But one of the reasons that we think these data end up being relevant to our question and why we would see this behavior is because Chrome is the most popular browser in the U.S., and if you go into the [search] bar and you type in a word—let’s say you type in “New York Times”—and you don’t put “.com” on it, it does a Google search. And it would do the same thing for pornography. And so what we expect is happening here is: People, essentially, are just typing in a word, and that’s why we’re picking it up in the overall search volume.
Demsas: I guess the way I would think about it if this were to matter a lot is, I guess, the more sophisticated porn users, whether they have pages saved or whatever, those folks are less likely to come up on your Google search as part of that traffic. And then so you’re getting—I don’t know what it means to be a less-sophisticated porn watcher, but using that terminology, like, those folks are the ones you’re largely capturing, because—
Sanderson: But I think there’s another important dynamic here, bringing us back to kids—which, again, is the focus of the law—which is this dynamic: The major adult website in the country complies with the laws in one state, where they actually have age verification. In all the other states that pass these laws, they pull out.
And it requires either substitution or circumvention, right? You either need to substitute with a different website, or you need to circumvent via some technology like a VPN. If I just said, Who’s probably better positioned to navigate this new sort of legal environment if they are motivated to access adult content: digitally native kids or adults? I think, you know, as people living in the world, even though we don’t have direct access to this data, our priors would probably be that kids are much better equipped to substitute and circumvent.
And so in some ways, even if we saw overall porn consumption drop, which, again, is something that we hope to test in the future, I personally wouldn’t really expect kids to be part of that drop. I mean, they’re quite ingenious at getting around technical barriers.
Demsas: You think they wouldn’t have dropped at all?
Sanderson: I’m not sure.
Demsas: I think I would expect that there’s some drop. Like, there are some people who are just marginally like, Okay, I’m just not going to search this now.
Sanderson: Anytime you add friction to anything, it’s very rare to see an increase as a result of the friction. So again, our prior should be that you would see some drop. But the challenge, of course, is: How much of a drop was there? And for the kids, or for everybody but especially the kids who are still consuming content that the policy makers and the public are quite concerned about, has that content changed? And I think that’s a really important question for the policy community to ask here, because these two firms are not the same, right? Pornhub and XVideos are qualitatively different, if only to start because we know that one complies with the law and the other doesn’t.
Demsas: And so I want to talk about this, though, this noncompliance, because I think that obviously you probably would see a much larger decrease if there were no major noncompliant websites at all. Yes, some people would figure out VPNs, but a lot of people have trouble figuring that out and don’t even know what that is or don’t know how to set it up—and, like, it’s not crazy complicated, but it does take some effort to set that up for yourself. And it feels a little bit more illegal than, like, just, Oh, I’ll just go to the next site on Google. That’s a very different sort of friction you’ve created for people, to use your language there.
But, you know, the reason XVideos is noncompliant is, in part, because the government wasn’t willing to go nuclear and say, ISPs, you have to stop serving websites that are noncompliant, right?
Sanderson: Yeah. And, I mean, obviously, it’s possible to do that sort of with a scalpel, right? To say, okay, you know, ISPs don’t route any data from XVideos to the states that have passed these laws. I’ve made the mistake exactly once in a public context of speculating on legal questions, so I’ll try not to do that again here. But my guess is that would need to come from the federal government, given its various effects on other states.
There’s also been a lot of development in much more sophisticated age-verification protocols that many of the states just decided not to take up here. And to a certain extent, that wouldn’t solve this problem, which is that any age-verification protocol will be accompanied by some level of friction. And so, you know, if any level of friction is a deterrent to using a compliant site, then maybe you would still see people move over to noncompliant sites.
But there were much better ways to age verify with fewer privacy considerations where potentially we wouldn’t have seen such stark effects. That said, we saw stark effects in Louisiana, where Pornhub stayed active in the state, where they had this sort of digital wallet. And in our numbers, like I mentioned, we show a 40 to 50 percent drop, depending on the state. But Pornhub itself actually reports an 80 percent drop in volume from Louisiana after the law, so even larger than ours.
[Music]
Demsas: After the break: Is there really a right to access porn privately?
[Break]
Demsas: I want to ask you about this privacy question, because I think it’s at the core of a lot of the pushback to this. As you said, there are a lot of people who would be amenable to stopping or blocking kids from accessing this sort of content. But when it runs up against their own ability, as adults, to access adult content or any kind of content on the internet without the government having to verify their ID or their age, I think that’s when it becomes kind of tricky for a lot of people.
And, you know, I started thinking about this because, I mean, I’m a digital native. I grew up on the internet. I was on Tumblr with my pseudonymous account, and I enjoyed being anonymous on the internet. That was, like, a fun thing, and I think that can be valuable. And, you know, there are free-speech concerns and, of course, you know, political-activism concerns with the government intervening too much in this space and with corporations intervening too much in this space.
But at the same time, the expectation that your access to pornography is private is pretty new. I feel like I was watching a Gilmore Girls episode when I realized how normal this was, and there’s a back room of the video store where they’re all going to get porn, and I was just like, What? I can’t even believe this.
But it’s like, that’s genuinely the main way that people were accessing porn, or they were going to get it shipped to their house. But there was already verification with these steps. Like, you had to have some sort of verification happening. It was difficult to get it. Obviously, kids were still able to, like, you know, get someone else’s magazine, have someone buy it for them. But in the same way that we ban alcohol for minors, even though some kids can get around it, we see that as possible.
So walk me through how you think about this privacy question, because it is one where my knee-jerk reaction around the internet is that I care about privacy. But it also is quite new to demand a right to privacy around getting porn. Like, that’s quite novel.
Sanderson: Yeah. So I feel like in this episode, I probably have already pissed off some psychologists, some First Amendment lawyers, and now I’ll add the privacy community to the mix. But so I think that there are sort of two things here, right?
So the first is that I sort of broadly agree with you. This is sort of, like, a novel privacy right: being able to access porn in a fully private setting where we don’t need to affirm our identity in any way. However, on the other hand, in order to build a sort of identity architecture into age verification across various, you know, websites and apps, we would really need to fundamentally rethink the way the internet works.
And I don’t want to pretend like trying to solve the problem around age verification on adult content would get rid of, like, anonymity everywhere. That’s certainly not the case. But I do want to emphasize that really thinking about identity affirmation online is something that comes with all sorts of trade-offs and broadly is not the norm, right? Broadly, while we might need to identify ourselves as a user, right—we have a username and a password—in many contexts, we don’t actually need to turn over any personally identifying information about ourselves.
And so one of the interesting things here is that this is where a lot of work—and really exciting work—has been done, and there are various methods for thinking about how you might be able to do age verification in a way that actually does preserve privacy. I don’t really want to go into—I think the technical details, in some ways, are less important than the overall logic here.
And the logic is that you sort of have a service or a platform or a website on one side that needs to verify someone’s age. And on the other side, you know, you have another service that knows, at minimum, an age range. And what you really want to do in order to effectively do age verification while preserving privacy is let the website know that a particular user is above or below a certain age, without letting that website know anything about that user and without letting the age-verification system know what website is asking the question.
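That logic can be made concrete with a toy token flow: an issuer that knows the user’s age signs an “over 18” claim without being told which website will consume it, and the website verifies the claim without learning who the user is. This is a minimal sketch of that double-blind idea, not any deployed protocol; it assumes the third-party Python `cryptography` package, and every name in it is hypothetical.

```python
# Toy sketch of the "double-blind" age-attestation logic: the issuer
# (say, a digital-ID wallet that knows the user's age) signs an
# "over 18" claim without being told which website will consume it,
# and the website verifies the claim without learning who the user is.
# Not a deployed protocol; requires the third-party `cryptography` package.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Issuer side ---
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()  # published; websites pin this key

def issue_attestation(user_is_over_18: bool) -> bytes | None:
    """Return a signed, single-use over-18 token, or None.
    The issuer never learns where the token will be presented."""
    if not user_is_over_18:
        return None
    nonce = os.urandom(16).hex().encode()  # fresh per token: no replay, no linking
    message = b"over18:" + nonce
    return message + b"|" + issuer_key.sign(message)

# --- Website side ---
def admit(token: bytes) -> bool:
    """Verify the issuer's signature. The site learns only 'over 18';
    no name, birthdate, or ID document ever reaches it."""
    message, _, signature = token.partition(b"|")
    try:
        issuer_pub.verify(signature, message)
    except InvalidSignature:
        return False
    return message.startswith(b"over18:")

token = issue_attestation(user_is_over_18=True)
print("admitted:", admit(token))  # True -- and no identity was disclosed
```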
So recently, Apple came out with a white paper where they sort of proposed a particular mechanism by which parents set up child accounts, and they have an age range that is stored on the phone, and that age range can be made available to apps via the App Store. But again, these two things aren’t really talking to each other. And obviously, Apple has long held privacy as a core of what they’re doing. So yes, there are some companies doing this. I really hate saying this word at all, but, like, this is an application for a blockchain or some sort of—
Demsas: I was waiting for it. I knew it was going to come up. (Laughs.)
Sanderson: —or some sort of distributed technology. There’s been a lot of technical advancements in something called zero-knowledge proofs—so, essentially, a protocol in which one party can convince another party that some given statement is true, without conveying any information to the verifier beyond the fact of that statement. So, like, that’s the sort of logic of the computation that’s going on. And so, you know, again, not a crypto person, though I think that, in general, thinking about genuinely useful applications of distributed technologies is interesting. And this might be one.
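To give a flavor of the zero-knowledge idea Sanderson mentions, here is the textbook Schnorr identification protocol in a few lines of Python: the prover convinces the verifier that it knows a secret x with y = gˣ mod p, while the transcript reveals nothing about x beyond that fact. The tiny hardcoded parameters are for demonstration only; real systems use enormous groups and non-interactive variants.

```python
# Toy Schnorr identification protocol -- the textbook interactive
# zero-knowledge proof. The prover shows it knows a secret x with
# y = g^x (mod p) without ever transmitting x. Parameters are absurdly
# small, for demonstration only.
import secrets

p = 2039   # prime modulus, chosen so p = 2q + 1
q = 1019   # prime order of the subgroup we work in
g = 4      # generator of the order-q subgroup (4 = 2^2 is a quadratic residue)

# Prover's secret and its public image. Think: "I hold a credential."
x = secrets.randbelow(q - 1) + 1
y = pow(g, x, p)  # public; recovering x from y is the discrete-log problem

# Round 1 -- commitment: prover picks a fresh random r, sends t = g^r.
r = secrets.randbelow(q)
t = pow(g, r, p)

# Round 2 -- challenge: verifier replies with a random c.
c = secrets.randbelow(q)

# Round 3 -- response: prover answers s = r + c*x (mod q).
s = (r + c * x) % q

# Verification: g^s == t * y^c (mod p) iff the response is consistent.
# A prover who didn't know x couldn't answer two distinct challenges,
# yet the transcript (t, c, s) leaks nothing about x itself.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("Proof accepted; x was never revealed.")
```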
Demsas: I wonder, from the perspective of trying to attack this from a different actor, like, right now, we’ve talked a lot about: How do you address this by finding the website, by making the websites compliant, by creating that sort of change? How do you think about this from a parent standpoint? Like, holding parents responsible in the same way that we hold them responsible for truancy, for instance, in some states? Responsible for installing porn blockers on kids’ computers and, you know, responsible for ensuring that kids are not using this on their smartphones. Like, what do you think about that approach, and is there research that illuminates whether this is actually effective?
Sanderson: Yeah, so I’m really happy you asked this question, because it emphasizes, I think, this sort of broad dynamic in tech policy that you can’t solve, sort of, “insert societal challenge” at the level of tech policy. If what we’re after is more developmentally appropriate content consumption broadly around kids, because we care about their development, tech policy like age verification is going to be one small piece of a much larger policy and nonpolicy agenda. And parents play a huge role in this.
Demsas: When I was doing some research for this episode, I came across this interesting survey that was trying to ask people about their first exposure to pornography. It’s not a huge sample, but it was a 2017 study that surveyed 330 undergrad men, 17 to 54 years old. I assume that is an outlier 54-year-old.
But the participants were 85 percent white, primarily heterosexual. And when they were asked about their first exposure, the mean age was 13.37 years, so kind of in line with what you told us at the top of the episode. But what’s interesting is that 43 percent of men indicated that their first exposure was accidental, which reminded me—again, who knows—maybe there’s social-desirability bias here, where you don’t want to say you were looking for porn at 12. I have no idea.
But part of what struck me is: It is very, very normal, particularly now that X has changed its protocols significantly, to just be, like, on the internet and come across porn accidentally. Like, that will happen. Like, now you see this on Reels, on TikTok, where you see content that is very close to porn or, like, porn adjacent or even really explicit content on websites that are not normally predominantly serving that sort of content. And that’s something that I think these sorts of laws really don’t do much about but that I would imagine has a larger impact on, you know, the adolescents that we’re trying to prevent from having to see this in an unwanted way.
So, you know, when I was, like, in elementary school, I remember I was at the school library, and these were big desktop computers. And I saw a group of kids huddled around a computer, and I walk over. And, like, I’m 6 or 7 years old at this point. And they’re, like, kids looking at porn, and they’re laughing and showing this around. And I remember being horrified at what had just occurred, and I kind of ran away and pretended it hadn’t happened. But it stuck in my brain for a long time.
And I imagine, like, that’s the sort of thing you’re trying to prevent—beyond just, like, the normal healthy sexual interactions people are having—kids, in a way that feels uncomfortable or unwanted, having to experience sexual content like that. Are there laws that could even address something like that? Because that is not something that you can go to, like, a central provider like Pornhub or XVideos or whatever it is. That’s just, like as you said, kind of littered throughout the whole internet.
Sanderson: Yeah. So the short answer is yes. There are sort of policy mechanisms by which we could imagine getting there. And I say, “imagine getting there,” because, you know, we don’t pass a ton of tech policy at the federal level. A lot’s being passed at the state level. But for various reasons, a lot of what’s being passed at the state level, it’s sort of simple approaches to quite complicated problems. And what I’m about to sort of try to describe is, like, a complicated problem to try to solve.
But you could imagine, let’s say, on something like X or Reddit or Instagram that there’s some legal requirement where they’re making some determination about the type of content that’s on the platform, right? So on Instagram, say you have two photos. They can have a bunch of automated classifiers running that are able to say, This photo is not adult content, and that photo is adult content. And baked into this general push to try to expand age verification across the, like, social internet—think about what sort of social media platforms kids have access to and how—one of the things that you could do is: If you’re age verifying kids to go on social media, you also have legally mandated content filters that strip out adult content from that feed. And it would obviously be imperfect, but it would probably solve for a fair amount of what you just described, which is, like, large-scale incidental exposure.
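A stripped-down sketch of that filtering layer, in Python: each post carries a score from a hypothetical upstream classifier, and flagged posts are dropped from feeds served to age-verified minor accounts. The threshold, field names, and scores are invented stand-ins, not any platform’s actual system.

```python
# Minimal sketch of a legally mandated feed filter: each post carries a
# score from an upstream automated classifier, and posts flagged as adult
# content are stripped from feeds served to age-verified minor accounts.
# The threshold, field names, and scores are invented stand-ins.
from dataclasses import dataclass

ADULT_THRESHOLD = 0.8  # hypothetical cutoff on the classifier's score

@dataclass
class Post:
    post_id: str
    adult_score: float  # probability-like output of a content classifier

def build_feed(posts: list[Post], viewer_is_minor: bool) -> list[Post]:
    """Drop classifier-flagged adult content for verified-minor accounts;
    adult accounts get the feed unchanged."""
    if not viewer_is_minor:
        return posts
    return [p for p in posts if p.adult_score < ADULT_THRESHOLD]

feed = build_feed(
    [Post("a1", 0.02), Post("a2", 0.95), Post("a3", 0.40)],
    viewer_is_minor=True,
)
print([p.post_id for p in feed])  # ['a1', 'a3'] -- the flagged post is gone
```

As Sanderson notes, any such classifier is imperfect; the policy bet is that imperfect filtering still meaningfully reduces large-scale incidental exposure.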
Demsas: Yeah. It doesn’t stop random 7-year-olds from, I guess, showing each other porn. But—
Sanderson: Yeah, I think that’s sort of an age-old problem.
Demsas: Exactly. I know that you’re not a lawyer, but I did want to ask you about the changing—it seems changing—legal environment around these questions. For a long time, as you mentioned, there’s been kind of this distinction between getting porn in person, and you can check ID at, like, the video store or whatever it is, versus getting it online, where there’s been sort of a free-speech argument that you can’t really regulate that in the same way. That might be changing. Can you tell us what’s going on?
Sanderson: Yeah, essentially, there was a major law in the mid-’90s that was passed called the Communications Decency Act, and it was the first really serious piece of federal legislation that attempted to regulate minors’ access to online materials. And it did it in a few different ways, but ultimately, it was struck down. And it was struck down not because the government didn’t have a legitimate state interest in regulating or limiting access for kids to adult content, but instead because the court believed that the way that it was happening would have infringed upon the First Amendment right of adults.
And so in general, there is sort of this legal precedent that restricting kids’ access to adult content is permissible—kids’ access to adult content is not First Amendment–protected speech—so long as the mechanism by which you do it doesn’t limit adults’ First Amendment–protected speech and there’s a legitimate state interest in attempting to accomplish what I just described.
And in the mid-’90s, they really didn’t have a good way of doing it. They also, you know, didn’t have a great way of defining what adult content was. I think that, largely because age-verification mechanisms have gotten so much more sophisticated and granular—and I think we saw this in the court hearings—we’re moving towards courts upholding these laws, because there is this precedent that the state can attempt to regulate kids’ access to porn, so long as it doesn’t infringe upon adults.
Demsas: So we’ve kind of, throughout this conversation, really accepted the premise that this is a problem, that children accessing pornography is a problem. And one thing I want to do is just maybe stress-test that a bit with you, because some people think this is just another moral panic. Whether it’s about youth and internet porn, whether it’s about smartphones, whether it’s about, you know—it’s just like comic books, like it’s rock music, like it’s video games. Public fear can often race ahead of what the evidence shows. And this is a difficult space where finding really high-quality, causal evidence is difficult, if not impossible to do.
Are you afraid that this is kind of just a spun-up moral panic, one driven by these high-profile anecdotes from Billie Eilish or whatever, and a kind of social-conservative backlash, among a bunch of other vectors, but that this is really not the sort of thing that requires a bunch of government intervention, and that maybe the best thing to do is just hold off and see if private-sector technologies and culture can kind of correct for itself?
Sanderson: I mean, what’s interesting to me is that this is much more an ethical question than it is an empirical question. I think one of the fascinating things studying tech policy, in general, and then especially this area, is that the sort of evidentiary standard that we have to be able to definitively say, X causes Y, is something that in so many areas around technology policy trying to protect kids we just don’t have. And so the question is: What do we do in a context where getting that sort of causal standard or, you know, the gold standard for causal evidence probably isn’t possible?
And so whether or not this is a real problem is, I know, a debate in the psychology literature. It’s a debate amongst parents. And in many ways, what politics and policy making are is an infrastructure to sort of figure out or come to some consensus of that debate. However, I think the challenge becomes, we want policy to do something, to have some effect. And as part of that, what we also want then is this sort of evidence-based feedback loop, where we’re not just passing policy, wiping our hands, and saying our job is done, but instead actually doing something similar to what we’ve done here.
You could imagine policy makers partnering with academics, preregistering studies to understand the effect of these sorts of laws on the outcomes that we’re really interested in. And so my fear is less that this is just a moral panic, because I think, in part, politics is there to figure out a distribution of moral preferences across a population. And instead, what I’m more concerned about is that there isn’t this really rigorous, evidence-based feedback loop where we’re able to just continue to iterate and make policy better.
And I think this is one area where we’ve clearly seen it, where we show, Look—like, a compliant firm has dropped. A noncompliant firm that platforms content that we would be more concerned about has risen. And it’s not clear to me that any laws are going to change as a result. And that’s where I don’t think we want to be in a policy environment.
Demsas: It feels like a lot is about to change with AI in this space. Right? Like, I was on Instagram, and I don’t know if you’ve seen these suggested AI chatbots that they have. And there have been stories of people kind of developing, you know, romantic relationships with them. There was a really sad one in the New York Times about a young boy who actually took his own life after having a relationship with a Character.AI chatbot. I don’t know if it’s causal there, but the story indicated that he had really developed a romantic and personal relationship with this AI agent. And, you know, it’s not going to be just porn websites soon. It’s going to be people having, like, personal interactions with AI girlfriends, boyfriends, whatever.
And that sort of thing, I think, would require even greater privacy violations to prevent from happening, and would create bigger problems for companies trying to be compliant with regulation. It feels like any solution is going to be kludgey. So if you’re going to try to stop kids from accessing porn online, you’re going to stop them and adults from accessing a lot of things. And it’s going to create a bunch of friction and annoyance. It’s going to create some level of privacy violation, some level of First Amendment violation, and maybe not literally constitutionally, but it’s going to create some feeling that your speech has been quelled.
How do you think through this problem? Because, to me, if you’re asking me, okay, you either have to accept a world where you know, kids are having really intimate relationships with AI chatbots, and it’s degrading their ability or desire to interact with people who they’re attracted to in real life, and that continues the degradation of, you know, the children’s experiences in the real world—I guess “real” in quotes. It’s real to them, but, you know—
Sanderson: The embodied world.
Demsas: The embodied world. That’s a better word. Then it’s a much more difficult question. I think as policy wonks in D.C., we want there to be this really perfect solution—there’s, like, some technological solution or some sort of policy solution that actually targets the specific thing you’re worried about. But largely, a lot of effective policies are effective because they’re expansive. I don’t know how you think about that.
Sanderson: Yeah. I mean, tech policy, like every policy area, is just a set of trade-offs that we figure out how to navigate. I think if we want to steelman the argument for age verification broadly, it’s that if we develop sort of low-friction ways of verifying age without any serious sort of privacy violations, we’re able to essentially do that quite broadly, but we’re never going to be able to be perfect. Perhaps it’s that, you know, porn dropped somewhat overall, but the stuff that remains shifted to worse places. Like, those are the sort of trade-offs that we constantly need to make when we think about policy interventions here.
The one interesting, unique challenge, though, about regulating the sort of digital-information space is that the companies that are making these tools or running these platforms have a monopoly on the data they collect. And that’s really different from other policy spaces. Can you imagine if we needed to figure out sort of, like, interest-rate policy, but some company owned all of the employment-rate data? Like, that would just be this really challenging, I would argue impossible, environment in which to craft good policy.
And that’s essentially what we’re doing here. And so I don’t think that data access solves everything, but one of the things that I wish we would see come back up—there was some momentum around this a few years ago in Congress—is that as we think about making policies, and as we really try to rigorously quantify the trade-offs that will inevitably be there, we do so with as much really good data as possible. Otherwise, the fear is that there are all sorts of unintended consequences, the severity of which we’re not able to measure. And so I think that needs to be part of any sort of broad solution that we bring to regulating online spaces and online access.
Demsas: So I think that’s a great place for our last and final question: What is something that you thought was a good idea but ended up being only good on paper?
Sanderson: Yeah, so I was a basketball player growing up, and I was a pretty good basketball player, and I ultimately became a mediocre Division I point guard.
Demsas: That’s pretty impressive. This is turning into a humblebrag already.
Sanderson: No. Not at all. It’ll quickly not. And, you know, I dreamed my entire life of sort of playing in a Division I program. And I got there. I played at Brown. And when I was there, we were sort of at the back of the Ivy League, which itself was one of the worst leagues in America.
And, you know, I went from, like, a high school where lots of people would show up to games to a number of friends not even knowing we had a basketball team to, you know, practicing 40 hours a week while all of my other friends were having fun, and thinking, Is this something that I really want to do? What was it that I was dreaming of?
Demsas: Well, Zeve, thank you so much for coming on the show.
Sanderson: Yeah. Thank you so much, Jerusalem.
[Music]
Demsas: Good on Paper is produced by Rosie Hughes. It was edited by Dave Shaw, fact-checked by Ena Alvarado, and engineered by Erica Huang. Our theme music is composed by Rob Smierciak. Claudine Ebeid is the executive producer of Atlantic audio. Andrea Valdez is our managing editor.
And hey, if you like what you’re hearing, please leave us a rating and review on Apple Podcasts.
I’m Jerusalem Demsas, and we’ll see you next week.