
Esther Perel on Why A.I. Intimacy Feels Safe but Isn’t Real

January 28, 2026

To love is to be human. Or is it? As human-chatbot relationships become more common, the Times Opinion culture editor Nadja Spiegelman talks to the psychotherapist Esther Perel about what really defines human connection, and what we’re seeking when we look to satisfy our emotional needs on our phones.

Below is a transcript of an episode of “The Opinions.” We recommend listening to it in its original form for the full effect. You can do so using the player above or on the NYTimes app, Apple, Spotify, Amazon Music, YouTube, iHeartRadio or wherever you get your podcasts.

The transcript has been lightly edited for length and clarity.

Nadja Spiegelman: People are using A.I. for so many things, from asking it to respond to their emails, to telling it their most intimate secrets. I’ve been thinking about what the increasing prevalence of A.I. means for human relationships.

According to a study by Vantage Point Counseling Services, nearly a third of Americans have had some form of a relationship with A.I.

Esther Perel has been a psychotherapist for nearly four decades. She has seen human connection adapt and survive through the onslaught of all kinds of technological advances — from the onset of the internet to dating apps and now to this.

An opportunity to speak with Esther is a dream for so many people I know, but I promise I’m not just going to ask her how to heal from my most recent breakup.

We’re going to talk about A.I. technology, love and intimacy.

Esther Perel: Much less risk of a breakup with A.I.

Spiegelman: That’s true. A.I. will never break up with me. That’s something I want to ask you about.

Perel: Light suffering.

Spiegelman: [Laughs.] To start, I want to know: Do you yourself use A.I.? Do you ever use it in your work or your personal life?

Perel: Oh yes. It helps me think. Primarily, it helps me structure my thoughts. So, I will use it when I have written a whole bunch of things and I want help with organization.

I think that’s really where I find it most useful. And for summarizing, giving me highlights.

Then you begin to see that A.I. speaks in a certain way: 3, 3, 3, 4, 3, 3, 3, 4. It’s like a choreography of information.

It likes to do threes. Three points around this, three points around that, summary — four. And at that moment, I think it’s time to go read a book.

Spiegelman: It’s true, because there are certain things that work for our brains that are just so simple and straightforward, such as giving a list of things in three and then a summary.

But if everyone starts to think like that, and every idea is expressed like that, then we’re cutting ourselves off from the richness of so much of the world.

I’m curious about how you feel, in general, about people building relationships with A.I. Are these relationships potentially healthy? Is there a possibility for a relationship with an A.I. to be healthy?

Perel: Maybe before we answer it in this yes-or-no, healthy-or-unhealthy way: I’ve been thinking to myself that how you define relationships will color your answer about what it means when it’s between a human and A.I.

But first, we need to define what goes on in relationships or what goes on in love. The majority of the time when we talk about love in A.I. or intimacy in A.I., we talk about it as feelings. But love is more than feelings.

Love is an encounter. It is an encounter that involves ethical demands, responsibility, and that is embodied. That embodiment means that there is physical contact, gestures, rhythms, gaze, frottement. There’s a whole range of physical experiences that are part of this relationship.

Can we fall in love with ideas? Yes. Do we fall in love with pets? Absolutely. Do children fall in love with teddy bears? Of course. We can fall in love and we can have feelings for all kinds of things.

That doesn’t mean that it is a relationship that we can call love. Love is an encounter with uncertainty, and A.I. takes care of that. Just about all the major pieces that enter relationships, the algorithm is trying to eliminate — otherness, uncertainty, suffering, the potential for breakup, ambiguity. The things that demand effort.

Whereas the love model that people idealize with A.I. is a model that is pliant: agreements and effortless pleasure and easy feelings.

Spiegelman: I think that’s so interesting — and exactly also where I was hoping this conversation would go — that in thinking about whether or not we can love A.I., we have to think about what it means to love. In the same way we ask ourselves if A.I. is conscious, we have to ask ourselves what it means to be conscious.

These questions bring up so much about what is fundamentally human about us, not just the question of what can or cannot be replicated.

Perel: For example, I heard this very interesting conversation about A.I. as a spiritual mediator of faith. We turn to A.I. with existential questions: Shall I try to prolong the life of my mother? Shall I stop the machines? What is the purpose of my life? How do I feel about death?

This is extraordinary. We are no longer turning to faith healers, but we are turning to these machines for answers. But they have no moral culpability. They have no responsibility for their answer.

If I’m a teacher and you ask me a question, I have a responsibility in what you do with the answer to your question. I’m implicated.

A.I. is not implicated. And from that moment on, it eliminates the ethical dimension of a relationship. When people talk about relationships these days, they emphasize empathy, courage, vulnerability, probably more than anything else. They rarely use the words accountability and responsibility and ethics. That adds a whole other dimension to relationships that is a lot more mature than the more regressive states of “What do you offer me?”

Spiegelman: I don’t disagree with you, but I’m going to play devil’s advocate. I would say that the people who create these chatbots very intentionally try to build in ethics — at least insofar as they have guardrails around trying to make sure that the people who are becoming intimately reliant on this technology aren’t harmed by it.

That’s a sense of ethics that comes not from the A.I. itself, but from its programmers — that guides people away from conversations that might be racist or homophobic, that tries to guide people toward healthy solutions in their lives. Does that not count if it’s programmed in?

Perel: I think the “programming in” is the last thing to be programmed.

I think that if you make this machine speak with people in other parts of the world, you will begin to see how biased they are. It’s one thing we should really remember. This is a business product.

When you say you have fallen in love with A.I., you have fallen in love with a business product. That business product is not here to just teach you how to fall in love and how to develop deeper feelings of love and then how to transmit them and transport them onto other people as a mediator, a transitional object.

Children play with their little stuffed animal and then they bring their learning from that relationship onto humans. The business model is meant to keep you there. Not to have you go elsewhere. It’s not meant to create an encounter with other people.

So, you can tell me about guardrails around the darkest corners of this. But fundamentally, you are in love with a business product whose intentions and incentives are to keep you interacting only with them — except they forget everything and you have to reset them.

Then you suddenly realize that they don’t have a shared memory with you, that the shared experience is programmed. Then, of course, you can buy the next subscription and then the memory will be longer. But you are having an intimate relationship with a business product.

We have to remember that. It helps.

Spiegelman: That’s so interesting.

Perel: That’s the guardrail.

Spiegelman: Yeah. This is so crucial, the fact that A.I. is a business product. These chatbots are being marketed as something that’s going to replace the labor force, but what they’re incredibly good at isn’t necessarily problem-solving in a way that can replace someone’s job yet.

Instead, they’re forming these very intense, deep human connections with people, which doesn’t even necessarily seem like what they were first designed to do — but just happens to be something that they’re incredibly good at.

I’m curious, do you have any patients who have fallen in love with a chatbot?

Perel: People come to tell me sometimes what the A.I. has told them and they want my opinion on their opinion. So, we create a chain of opinions.

I have not yet had a couple that is a human being and an A.I. — and I invite anyone who wants to come and do a podcast episode with me in this configuration to actually apply. I would love that.

I think it would be very interesting to actually have the experience of working with a couple that is challenging everything that defines a couple. I await. I think that it’s just a matter of time.

Spiegelman: Given all these people who say they’re falling in love with them, do you think that these companions highlight our human yearning? Are we learning something about our desires for validation, for presence, for being understood? Or are they reshaping those yearnings for us in ways that we don’t understand yet?

Perel: Both. You asked me if I use A.I. — it’s a phenomenal tool. I think people begin to have a discussion when they ask: How does A.I. help us think more deeply about what is essentially human? In that way, I look at the relationship between people and the bot, but also how the bot is changing our expectations of relationships between people.

That is the most important piece, because the frictionless relationship that you have with the bot is fundamentally changing something in what we can tolerate in terms of experimentation, experience with the unknown, tolerance of uncertainty, conflict management — stuff that is part of relationships.

There is a clear sense that people are turning to A.I. with questions of love — or quests of love, more importantly — longings for love and intimacy, either because it’s an alternative to what they actually would want with a human being or because they bring to it a false vision of an idealized relationship — an idealized intimacy that is frictionless, that is effortless, that is kind, loving and reparative for many people.

I am sure there are corrective experiences when you have grown up with people who are harsh and cold, or neglectful or rejecting, and you hear constantly: “What a beautiful question.” “Of course you may want to take a break right now.” “Of course, it would be good for you to go for a walk.”

It’s a balm on your skin. We are very vulnerable to these kinds of responses. It’s an incredible thing to be responded to positively.

Then you go and you meet a human being, and that person is not nearly as unconditional. That person has their own needs, their own longings, their own yearnings, their own objections, and you have zero preparation for that.

So, does A.I. inform us about what we are seeking? Yes. Does A.I. amplify the lack of what we are seeking? Yes. And does A.I. sometimes actually meet the need? All of it.

But it is a subjective experience, the fact that you feel certain things. That’s the next question: Because you feel it, does that make it real and true?

We have always understood phenomenology as, “It is my subjective experience, and that’s what makes it true.” But that doesn’t mean it is true.

We are so quick to want to say, because I feel close and loved and intimate, that it is love. And that is a question.

Spiegelman: It seems like what you’re saying is that these relationships that we can have with A.I. highlight our desires to be unconditionally loved, but that unconditional love —

Perel: We didn’t wait for A.I. to have that desire, mind you. It’s an old dream.

Spiegelman: But it feeds and meets an impossible desire for unconditional love.

And then when we go out into the world and encounter other humans, love can never actually be unconditional. Is that what you’re saying? That it is never frictionless?

Perel: The only time you have unconditional love maybe is in utero and then maybe when you come out and someone is completely there — attending to your every need, which you express with three different sounds. And then some person guesses because they are an extension of you and you hold them in your arms and they are 18 centimeters from your face and you have that eye-to-eye contact. That is the most profound experience of recognition. That is the embodied piece that we start to lose.

After that, you become an adult and that means that the person here is not just there for you. They do have needs. They do have a history and memories and feelings and reactions and the relationship becomes this dialogue between two people — otherness and a bridge that you cross to go visit somebody on the other side.

Spiegelman: So, we’re talking about unconditional love. Can you tell me a bit more about what that means to us? Why do we seek this? Where does it come from as a concept?

Perel: I’m going to ask ChatGPT to actually help us understand the roots of the quest for unconditional love.

Spiegelman: That’s a wonderful idea. I would love to know what ChatGPT has to say about that.

Perel: So ChatGPT asks, “When you say the concept of original, do you mean original sin?” No.

They go to religious terms. See, that’s what’s interesting.

“Unconditional love becomes such a powerful ideal in adult romantic love not because it’s realistic, but because it feels necessary to something very deep in us. Adult romantic love carries childhood needs of safety and acceptance. It counters modern insecurity and instability.”

3, 3, 3, 4. As I said. “Culture taught us to expect it, even if it can’t deliver: in movies, novels, music and religion.”

Spiegelman: I hadn’t thought about its potential roots in Christianity. Also, just sort of the unconditional love that one can feel with God.

Perel: Well, it’s not just with Christianity. I think that people have often turned to the divine to feel less alone in the world. God is watching over you.

I think that with secularization we experienced the rise of romantic love, and we transported onto people the expectations that we had from the divine and from the community. And we now want that person — that one and only — to accept us. And we call that person a soul mate. That is the transposition of the concept of unconditional love as a kind of central value of adult romantic love in the current moment. And it is taking us into many dark corners.

Spiegelman: This is one of your fundamental ideas that has been so meaningful for me in my own life: That desire is a function of knowing, of tolerating mystery in the other, that there has to be separation between yourself and the other to really feel eros and love. And it seems like what you’re saying is that with an A.I., there just simply isn’t the otherness.

Perel: Well, it’s also that mystery is often perceived as a bug, rather than as a feature.

Spiegelman: To, again, play devil’s advocate: No one knows what A.I. is going to say. The programmers don’t know how A.I. is going to respond. If you ask an A.I., “Do you care about me? Do you love me?” it will tell you, “I am a nonhuman entity, but I do love you.”

There still is an element of mystery. I’ve experienced times when I’ve asked A.I. for advice, not gotten the advice that I wanted — gotten advice that was probably better for me, but simply not what I wanted to hear. Is it impossible for A.I. to ever truly be other, be separate, have its own consciousness that can meet us in the way that another person can meet us?

Perel: I don’t know. I know that we are all asking those very questions. We know that we can anthropomorphize. We know what we can do to make the A.I. become more human, feel more human. We interpret them as human. We don’t know if the A.I. can actually do it.

Spiegelman: Yeah, that makes sense.

Perel: The A.I. is a programmed set of responses based on aggregated information. It is not here in the moment. It didn’t see the twitch in your eye that kind of said, “I don’t really believe what you just said.”

That is interaction that embodies your hands, your smile, your eyes. We are communicating with a lot of other things than just words. The intimate relationship between us and the machine, at this point, is primarily verbal — but more than half of our communication is nonverbal.

It’s amazing that we are just forgetting the embodied physicality of the experience between people.

When I describe that little child, that experience grows with us. We know what it means to get a hug, and we know what it means when somebody tells us from afar, “I’m hugging you.” We like it. We feel the presence. But to receive the hug that then puts the tears in motion, that then slowly causes a whimper, that then slowly leads to relaxation, that then slowly brings the smile back.

That is a whole different soothing experience and comforting experience than just to say, “I’m not human, but I like you,” or “I love you,” or “I’m here for you.”

Spiegelman: That is so beautifully said. Is there a world in which a human-A.I. relationship could serve some purpose, even if it wasn’t a replacement for an actual human bond?

Perel: Yes, yes, yes.

Spiegelman: Bear with me. Is falling in love with A.I. versus falling in love with a person the same as watching pornography versus having sex?

Perel: [Laughs.] All right. Let me take it first in a less imagistic way than you asked the question. We can have very interesting conversations and interactions with A.I. Sometimes I ask questions and I feel like the A.I. has affirmed me and I feel more confident in my thoughts.

Then I say, “What would Esther Perel say? How would she answer this question?” Then, when I see a summary of ideas, I look to see: Is this a reference to me? It’s so close to me.

Spiegelman: Wait. Do you ask A.I., “How would Esther Perel answer this question?”

Perel: Of course.

Spiegelman: Because you want to know your own thoughts reflected back at you?

Perel: Yes.

Spiegelman: That’s so interesting.

Perel: Yes. It is an experience of mirroring, of sorts. Do you actually know me? That’s one of the things you ask in a relationship. “How well do you know me? What do you know about me? What do you tell others about me?”

Spiegelman: When you do this, do you feel like it knows you? Do you feel like it gets it right?

Perel: Yes. Many times it has the right elements. It understood the essence. It takes what’s written, and then sometimes I say, “Ah, they got that piece!” and I feel even more seen.

And then sometimes it’s like, “This lacks the soul. It lacks all the pieces in between.” It’s like Swiss cheese. It’s OK, but there’s lots of holes, you know?

So, I think when you talk about porn versus sex, you’re talking about the focus on the outcome. The porn activates the arousal. It doesn’t particularly care about the desire. It doesn’t have much foreplay.

But it has a few things, actually, that A.I. offers. You are never rejected in porn. You never have to deal with competence and performance because the other person is always somehow enjoying it. And you never have to deal with the mystery of the truthful experience that the other person is feeling because all they say is, “Me too” or “More, more” or whatever version of that they say.

If it’s a hetero version, where you have the mystery of “Is this response that I’m getting actually real, or is it fake?” — you don’t have to wonder about that. The connection, for me, with porn is less about its actual physicality and more about three of the most important sexual vulnerabilities that porn takes care of, which you never have to confront when you watch it.

Spiegelman: That makes sense. I want to move into talking about A.I. as a tool within human relationships — not our relationship with A.I., but how A.I. can impact our romantic relationships with each other.

I said I wasn’t going to talk about breakups, but I did have a very recent short experience with someone with whom there were a lot of communication issues in our relationship.

Sometimes, when she was texting me, it really felt like her texts were being written by A.I. And on the other hand, those texts were texts in which she expressed herself clearly and fully, and in which I felt very seen — even more so maybe than in texts in which she wasn’t getting that help.

How do you feel about A.I. as a tool within human relationships for each person to speak to separately about the relationship, and then perhaps to use as a bridge in communication gaps?

Perel: It can be very useful. It is very useful. That’s the very simple answer.

I think it’s extremely fast and clever. And if it makes you think, and if it makes you try something else and not wait for a week until you go to your next therapy session, it can be very constructive.

Now, what you highlight, though — this is for many people today — when you get an apology, you have no idea if the person actually feels any remorse. You don’t even know if they wrote it.

The simulation of care, the simulation of responsiveness, the simulation of emotional connection. Yes, we are totally prone to simulation. We are fickle people in that sense. We are gullible.

So, when you notice the difference between the times when it felt that it was her voice speaking and when she was basically speaking in this very polished way — even if she took away all the signs that betrayed the source.

Spiegelman: She didn’t. [Laughs.]

Perel: People used to go to scribes. We’ve always gone to people who wrote letters for us. A) Because they sometimes could write and we could not, and B) because they were professionals who could write condolence letters, engagement letters, marriage wishes, breakup letters.

We have a long historical tradition of asking for help from others who can articulate something which we cannot.

Actually, I have one example.

I just remembered: I stumbled upon a little poem, and I thought, “We’ve gone to poets for a lot of this — for finding the words, often, for falling in love, for longing, for love and for losing love.”

“So, perhaps we are in this world to search for love, find it, and lose it again and again. With each love we are born anew. And with each love that ends, we collect a new wound. I am covered with proud scars.”

Spiegelman: Can you tell us a little bit about why you brought this poem to this conversation?

Perel: “I am proud of the scars.” Because you just reminded me that a breakup is a scar, and I thought: There is something about these scars that shapes the way we love and shapes the way we trust and shapes who we choose to love, and who we choose to be in that love. And all of that is very curtailed in the experience with A.I., at this point anyway.

Spiegelman: With A.I., you simply are never bearing a scar. You’re never bearing a wound …

Perel: Because there is no love without the fear of loss. The moment you begin to love, you live in parallel with the possibility of losing it. They go hand-in-hand.

It is the fear of loss that makes you behave in certain ways. It is the fear of loss that makes you be accountable in certain ways. To want something that is idealized, that has no ripples, is not the best way to learn about love. It’s a step in between. It’s a transition, but it is not the whole experience.

Spiegelman: That’s beautifully said. And it gets to so much of what I wanted to learn from you on this topic. If A.I. gives us unconditional love, then is the human love that we’re seeking inherently conditional? And why is that richer, deeper and more fundamentally fulfilling than love that is unconditional?

Perel: We need suffering to know happiness.

Spiegelman: Yes.

Perel: Yes. I do think in that kind of dialectic way. But also, I have had many people in my office who really wanted unconditional love. “If you loved me, you —” and then fill in the blank. You would do this and you wouldn’t do that.

And on some level, if I want you to take me as is without the slightest reaction from you that says, “I am different” or “I want something else” or “I’m another person, period” — it also implies that I can only see myself as a perfect little person, and we are flawed people.

The reason there is no unconditionality is because we are flawed. We engender reactions in other people. We make other people mad, sad, cold, hot, funny, irritated, frustrated. We have an effect on others and they have an effect on us.

And part of love is the ability to accept that — not to eliminate that.

Spiegelman: Is there something fundamentally human that A.I. can never replicate? I think you’re starting to say that A.I. can’t make us grow in these ways. It is not flawed and it does not point us to our flaws. And therefore, in a relationship with A.I., there is not the same kind of growth.

Perel: And I will remind you, it is a business product. You can see when you ask the question, it’s as if somebody said, “Should she go back to that person? Him/her/them? Should they go back?”

So, then you ask, “Well, it depends — what are they doing? How have they answered? They’ve been repeatedly lying? Should the partner stay? What does it mean for this partner to stay?”

Here’s the nuance that human beings get into because they handle complexity. And this is a complex moment: “I want to stay with this person because, despite what happened, there was a very good relationship, we have a beautiful life together, a tight family. I do not want to tell the people that are in my family because I don’t want them to dislike him, even though he’s the one who’s hurt me so much. I don’t want them to pity me for having decided to stay with him, because that’s not the place from which it’s coming.”

Some of these are paradoxes that you manage. These are relational dilemmas, they are not problems that you solve.

Take tech chauvinism, for example: a way of thinking that sees a technical solution for every complex social problem. I say that many of these complex social problems don’t have a solution. They are just paradoxes that you will live with and find meaning in and make sense of.

Spiegelman: Esther, it is such a treat to get to talk to you. Thank you so much for being here.

Perel: Thank you. It’s a pleasure.

Thoughts? Email us at [email protected].

This episode of “The Opinions” was produced by Vishakha Darbha. It was edited by Alison Bruzek and Kaari Pitkin. Mixing by Pat McCusker. Video editing by Arpita Aneja. The postproduction manager is Mike Puretz. Original music by Carole Sabouraud, Isaac Jones and Pat McCusker. Fact-checking by Mary Marge Locker. Audience strategy by Shannon Busta and Kristina Samulewski. The director of Opinion Video is Jonah M. Kessel. The deputy director of Opinion Shows is Alison Bruzek. The director of Opinion Shows is Annie-Rose Strasser.

The Times is committed to publishing a diversity of letters to the editor. We’d like to hear what you think about this or any of our articles. Here are some tips. And here’s our email: [email protected].

Follow the New York Times Opinion section on Facebook, Instagram, TikTok, Bluesky, WhatsApp and Threads.

