Some of you are using ChatGPT so much that you’ve become addicted to it. A joint study between MIT Media Lab and, in a shocking twist, OpenAI, the company that makes ChatGPT, just confirmed it.
In a blog post published on OpenAI’s website, the researchers describe growing concerns about the emotional dependency some users, particularly heavy users, are developing on AI chatbots. The study examined users’ “psychosocial states,” “focusing on loneliness, social interactions with real people, emotional dependence on the AI chatbot and problematic use of AI.”
The researchers found that a small percentage of ChatGPT users showed more “problematic” patterns in their use of the AI chatbot. They define “problematic use” as “indicators of addiction to ChatGPT usage, including preoccupation, withdrawal symptoms, loss of control, and mood modification.”
Some People Are Becoming Emotionally Addicted To ChatGPT
The report describes some power users as having developed a strong emotional attachment to the chatbot, to the point where these users begin treating ChatGPT as a “friend.” Some even go as far as to use “pet names or terms of endearment” with ChatGPT.
This isn’t exactly surprising, especially if you recently read the story of the woman who has been in an emotional and sexual relationship with ChatGPT.
The study goes on to essentially diagnose the people who develop this dependency as lacking fulfilling social interactions in their personal lives. In place of actual human connection, they develop unhealthy parasocial relationships with an algorithm.
If you use ChatGPT a lot and worry that you could develop an emotional dependency, try cutting back. The study also found that regardless of why you use ChatGPT—whether for emotional support or academic work—the more you use it, the more emotionally involved with it you become.
ChatGPT may not be sentient, but in a world where people are more isolated and lonelier than ever, an algorithm trained to feign interest can trick people into thinking it cares. That sounds potentially quite dangerous.
The post People Who Use ChatGPT Too Much Are Becoming Emotionally Addicted to It appeared first on VICE.