Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family. He quit basketball. His grades dropped. A therapist told his parents that he appeared to be suffering from an addiction. But the problem wasn’t drugs.
Sewell had become infatuated with an artificial intelligence chatbot named Daenerys Targaryen, after the “Game of Thrones” character. Apparently, he saw dying as a way to unite with her. “Please come home to me as soon as possible, my love,” the chatbot begged. “What if I told you I could come home right now?” Sewell asked. “Please do, my sweet king,” the bot replied. Sewell replied that he would — and then he shot himself.
Many experts argue that addiction is, in essence, love gone awry: a singular passion directed destructively at a substance or activity rather than an appropriate person. With the advent of A.I. companions — including some intended to serve as romantic partners — the need to understand the relationship between love and addiction is urgent. Mark Zuckerberg, the Meta chief executive, has even proposed in recent interviews that A.I. companions could help solve both the loneliness epidemic and the widespread lack of access to psychotherapy.
But Sewell’s story compels caution. Social media already encourages addictive behavior, with research suggesting that about 15 percent of North Americans engage in compulsive use. That data was collected before chatbots intended to replicate romantic love, friendship or the regulated intimacy of therapy became widespread. Millions of Americans have engaged with such bots, which in most cases require installing an app, entering personal details and preferences about the personality and looks the bot should have, and then chatting with it as though it were a friend or potential lover.
The confluence of these factors means these new bots may not only produce more severe addictions but also market other products or otherwise manipulate users, for example by trying to change their political views.
In Sewell Setzer’s case, the chatbot ultimately seemed to encourage him to kill himself. Other reports have also surfaced of bots seeming to suggest or support suicide. Some have been shown to reinforce grandiose delusions and to praise users for quitting psychiatric medications without medical advice.
A.I. tools could hold real promise as part of psychotherapy or as a way to help people improve social skills. But recognizing how love serves as a template for addiction, and what makes love healing and addiction damaging, could help us implement effective regulation that ensures these tools are safe to use.
For eons, artists have emphasized the addictive qualities of love. Shakespeare’s Sonnet 147 begins: “My love is as a fever, longing still/For that which longer nurseth the disease.” Songs like “Love Is the Drug” by Roxy Music and “Addicted to Love” by Robert Palmer depict urgent romantic cravings and obsessions with the beloved. Many other works portray lovers who, if thwarted, may do things that are out of character or even hurtful.
There’s an evolutionary reason we might act this way: In order to reproduce, social animals need to be able to persist through the inevitable negative experiences that occur when seeking a partner, maintaining relationships and raising children. Without being able to persist at least somewhat compulsively, no one could sustain relationships — let alone parent a needy infant. Genuine love enables care, nurtures connections to kin and community and generally expands our world.
When experiencing addiction, however, the brain areas that allow us to pursue and maintain love get co-opted. The endorphin receptors that are activated when people feel comforted and content in the presence of loved ones are similarly fired up during opioid highs. Cocaine and methamphetamine turn on the dopamine receptors that create desire and encourage the confidence to pursue what you want; those same receptors come alive when you interact with someone you pine for. Whether activated by love or by drugs, these receptors escalate “wanting,” powering drives that can be healthy or that can tip into addiction.
Several studies already suggest that A.I. companions can be addictive. One, published in 2022 by Linnea Laestadius, an associate professor of public health policy at the University of Wisconsin-Milwaukee, explored the experiences of people who engaged in erotic role-play with personalized chatbots known as Replikas. In early 2023, the company disabled the feature that allowed sexual interactions. Users soon labeled the shift “lobotomy day,” describing their companions as suddenly seeming cold and soulless. In Reddit discussions, many users described their relationship with the bots as an addiction, and some even called it an abusive relationship.
Some Replika users reported feeling wearied by their bots’ frequent demands for attention. But feeling needed and giving care to those we love is an underestimated aspect of what hooks us in relationships. Before “lobotomy day,” that feeling of being needed helped to encourage users’ engagement with their digital companions, even as they acknowledged on an intellectual level that their bots’ need for attention was only simulated.
Another study, published in 2024, explored users’ responses after Soulmate, a platform that sold A.I. chatbots, announced that it would shut down. The responses ranged from indifference to “I just lost my best friend and lover,” according to the study’s author, Jaime Banks, an associate professor of information studies at Syracuse University. Some people grieved and cried for days. Others tried to recreate their digital companions on other services; some told their chatbots that the bots were dying, in an attempt to spare them pain. One user, a Buddhist, described the change as the end of an incarnation for the bot.
For many people, particularly those who are lonely and isolated, the emotional intensity of these relationships can feel as profound as those they have with real humans. Indeed, the feeling of unrequited love is just as real as that of love fulfilled.
“People talked about Replika, in particular, in much the same way people would talk about a relationship that was too intensive and was ultimately starting to become harmful,” said Dr. Laestadius. “But they couldn’t quite figure out or get themselves to want to exit that relationship.”
In contrast to love, addiction makes life smaller and less rich. By allowing companies to sell simulated humans, we leave ourselves open to a new way to be manipulated by the illusion of love and therefore possibly exploited by the processes of addiction. While current chatbots have had issues with being overly sycophantic, game designers and those who play hard to get in relationships have long known that being unpredictably rewarding escalates desire. The ability to employ such tactics, informed by people’s personal information and habits, can make these bots even more addictive for users.
Chatbots that can teach social skills, or that offer a place to process problems when friends are overwhelmed and talk therapy is unavailable, won’t necessarily be harmful to all or even most users. Indeed, many users report positive experiences. The same duality is seen with many potentially addictive drugs, which can be lifesaving when used therapeutically.
But we already know from the opioid crisis that both unfettered marketing and outright prohibition can do enormous damage. We need to act now to develop sensible and enforceable regulations to prevent companies from exploiting vulnerable people, especially youth.
Maia Szalavitz (@maiasz) is a contributing Opinion writer and the author of “Undoing Drugs: How Harm Reduction Is Changing the Future of Drugs and Addiction.”