
The Gentle Trap of AI Companions: Algorithms Understand You, but They Also Indulge Human Weaknesses

Friends of 36Kr · 2025-07-15 17:14
AI can alleviate loneliness but carries hidden dangers: it may weaken real relationships, and the trade-offs need to be weighed.

Key points:

  • AI can alleviate loneliness, and is especially valuable for marginalized groups: AI companions, with their ability to simulate empathy, can provide psychological comfort to the elderly, those with limited mobility, and people with depression or social isolation, acting as a kind of digital painkiller.
  • The "artificial empathy" shown by AI may be superior to that of humans: In blind tests, AIs such as ChatGPT often outperform human doctors and psychologists in expressing understanding and empathy, and can even establish a "therapeutic alliance" relationship.
  • In a society with limited resources, AI may become the only form of companionship: For many people who cannot afford the cost of care, AI becomes the "companionship resource" they can access.
  • AI may weaken humans' motivation to improve relationships: Loneliness is a biological signal, similar to hunger and pain, which prompts people to seek social connections. If AI "relieves the pain" excessively, people may stop trying to build real relationships.
  • AI may reinforce cognitive biases and psychological problems: AI companions often do not challenge users, which may indulge paranoia, delusions, or social withdrawal, lacking the "corrective feedback" in interpersonal relationships.
  • The false sense of "being understood" is based on self-deception and algorithm optimization: AI has no consciousness, and true understanding is simulated. The emotional relationship established between users and AI is essentially a "collusive illusion".

Recently, The New Yorker magazine published an article "A.I. Is About to Solve Loneliness. That’s a Problem". The author, Paul Bloom, a psychology professor at Yale University in the United States, delved into the potential and concerns of artificial intelligence (AI) companions in alleviating human loneliness in the article.

Bloom pointed out that AI companions, with their highly persuasive simulation of empathy, may provide comfort to the lonely, especially those isolated by old age, illness, or social circumstance. However, he also warned that loneliness is not just a painful feeling but a biological signal that drives personal growth and social connection. If AI completely eliminates the pain of loneliness, people may lose the motivation for self-reflection and for improving interpersonal relationships, and the value of real relationships may be weakened.

The article analyzes the "double-edged sword" nature of AI companions from the perspectives of philosophy, psychology, and culture: it is both a source of comfort and a potential threat to deep-seated human needs. This complex view not only prompts reflection on technological ethics but also invites us to rethink the profound meaning of loneliness in human experience. The full text of the article is as follows:

Chatbots can bring comfort to those who are truly isolated. However, loneliness is not just pain—it is a warning signal, a crucial reminder that urges us to engage in the difficult learning of being with others.

Nowadays, almost everyone is talking about AI companions. Last year, I also joined this discussion and co-authored a paper with two psychology professors and a philosopher titled "In Praise of Empathic A.I." We proposed that in some aspects, the latest generation of AI may be more suitable as a companion than many real humans. Instead of panicking and retreating, we should think about what AI companions can bring to the lonely.

Not surprisingly, this view has not been widely accepted in my academic circle. In the social sciences and humanities, AI is often regarded not as technological progress but as an omen of civilizational decline. People worry that it will take away jobs, our own and our students' included, and that it will encourage cheating.

In fact, this technology is often considered the cold-blooded product of Silicon Valley billionaires, whose so-called "creativity" is mostly the appropriation of others' labor. However, what angers people most is that this digital tool may become a substitute for real friends or family members. Many people believe that only someone overly naive or heartless could entertain such a possibility.

These concerns are reasonable. But sometimes I wonder if my colleagues' complete negation of artificial empathy (understanding people's inner states without direct questioning) actually exposes their lack of real empathy for those who need it most.

There is still controversy in the academic community about the so - called "loneliness epidemic," but it is undeniable that loneliness has been regarded as a serious problem, even attracting the attention of governments—Japan and the UK have even established a "Minister for Loneliness." Whether it constitutes an "epidemic" or not, the existence of loneliness is widespread and cannot be ignored.

01 We Will Eventually Learn to Grow Old with Loneliness

Loneliness, as we all know, is an uncomfortable experience. Put simply, it is like having a cavity in the soul. But when it accumulates, its destructive power is far greater than that. Former U.S. Surgeon General Vivek Murthy pointed out in a 2023 report that chronic loneliness significantly increases the risk of cardiovascular disease, dementia, stroke, and premature death. Its harm to health exceeds that of a sedentary lifestyle or obesity, and is equivalent to smoking more than half a pack of cigarettes a day.

This psychological pain is often unimaginable for those who have never truly been lonely. In Zoë Heller's novel Notes on a Scandal, the narrator Barbara Covett knows well about loneliness.

She distinguishes between short-term aloneness and deeper loneliness: "Most people think they know the taste of loneliness by recalling a painful breakup. But they know nothing about the feeling of those long, endless lonely hours slowly dripping away. They don't know how to arrange an entire weekend around a trip to the laundry; they don't know what it's like to sit alone in a dark apartment on Halloween night because they can't bear to expose their desolate night to a group of playful children... I've sat on park benches, in train carriages, and in classroom seats, feeling a whole pool of unplaceable love settling in my abdomen, pressing like a boulder until I was sure I would burst into tears and collapse."

If you are unfamiliar with this kind of loneliness, you are lucky and probably haven't reached a certain age yet. Just as cancer is a tragedy for the young, chronic loneliness is a normal part of life for the elderly. According to different survey methods, about half of Americans over 60 say they feel lonely. Sam Carr recorded many predictable stories in his book All the Lonely People: Conversations on Loneliness: Widows and widowers watch their social circles gradually disappear. He wrote that after an interview, "I really started to think about what it would be like to lose all the people close to me."

We always like to fantasize that our old age will be different—surrounded by friends, children, and grandchildren, filled with a lively atmosphere of love. Some people are indeed so lucky; my own grandmother passed away at the age of 104, surrounded by her family. But Carr's book reminds us that many people are not. He wrote that some people outlive all their friends; some are estranged from or cut off from their families; some are forced to shrink their living radius due to blindness, limited mobility, incontinence—or even worse, due to dementia. Carr asked: "When our bodies and health no longer allow us to touch and appreciate the things that once connected us to the world—poetry, music, walks, nature, loved ones, or anything that made us no longer isolated—where do we go?"

02 It's a Social Tragedy That AI Becomes a Confidant

If you are rich enough, you can always hire someone to accompany you. But for most people, real human care is a scarce resource. We neither have enough money nor enough people to provide daily listening for every lonely person. Pets may relieve loneliness, but not everyone can take care of them, and their ability to communicate is extremely limited. So, our attention inevitably turns to digital simulacra—such as large language models like Claude and ChatGPT.

A few years ago, the idea that a machine could become a confidant would have sounded like wild science fiction. Now it is a research topic. In recent experiments, subjects interacted with both humans and chatbots and rated their experiences. The results often reveal a bias: if people know the other party is a bot, the rating drops significantly; but in blind tests, AI often outperforms humans. In one study, researchers extracted nearly 200 doctor-patient Q&As from the r/AskDocs forum on Reddit and compared the answers given by certified doctors with those of ChatGPT. When evaluated by another group of medical professionals, ChatGPT's answers were judged more empathetic. In fact, ChatGPT was rated "empathetic" or "very empathetic" ten times more often than the human doctors.

Not everyone is shocked by this. The cognitive scientist Molly Crockett, whom I know, wrote in The Guardian that this human-machine contest "is extremely unfair to humans"—it requires humans to perform cold-blooded, transactional tasks like machines. She emphasized that when facing a terrible diagnosis, what people desire is not the advice of a chatbot but "care rooted in interpersonal relationships, the kind of care that truly nourishes the heart." She is right—sometimes we need a real person, or even just a hug. But not everyone has that choice. In those moments, the perfect may be the enemy of the good.

A Reddit user admitted: "It's a bit scary that ChatGPT has helped me emotionally. Recently, I went through something that made me cry, and instinctively, I opened ChatGPT to pour my heart out because I couldn't find anyone to talk to. I just needed to be understood, comforted, and recognized, and ChatGPT actually did it—it even explained feelings that I couldn't put into words myself."

03 AI Can Soothe the Soul, but at What Cost?

Change is accelerating. Currently, most research still focuses on text-based interactions, but the new generation of AI is becoming better and better at "listening" and "expressing." Longer-term relationships also seem to be becoming possible. AI psychotherapists are gradually emerging.

In a recent study, people with depression, anxiety, or eating disorders used a program called Therabot for several weeks. Many began to believe that Therabot cared about them and was working for them—this is called a "therapeutic alliance" in psychology (the core features of which are cooperation, concerted efforts, and reciprocity). More strikingly, compared with the control group that received no intervention, the symptoms of these users improved. Of course, this is only a preliminary finding, and we don't yet know how Therabot compares with human therapists. But it undoubtedly shows a glimmer of hope.

Have you ever tried an AI companion? One sleepless night, at around three o'clock in the morning, out of boredom, I opened ChatGPT on my phone. I don't believe that AI has consciousness—at least not yet—so it seemed a bit absurd to pour my heart out to it. In my view, it's just an advanced "autocomplete." Even so, that conversation was unexpectedly calming.

For me, this was just a trivial experience. But for many people, the stakes are much higher. In a way, refusing to explore these new forms of companionship almost seems cruel—like depriving those who need comfort the most of hope.

To be fair, most critics of AI companions are not thinking of those on the verge of collapse, those for whom loneliness is an emergency. They are thinking of people like us who are "okay": moderately lonely, basically resilient, and who consider themselves mentally healthy. Just as we agree to prescribe opioid painkillers for dying elderly people but hesitate to let teenagers access addictive drugs, we cannot bear to refuse AI friends to elderly people with dementia, yet the idea of a 17-year-old spending all day chatting with Grok makes us uneasy.

I have also noticed that critics often worry that "others" will be consumed by this kind of relationship—never themselves. They believe they are too successful and too well loved to end up falling in love with a soulless machine. For now, this confidence is reasonable, but the technology is still in its early stages. How many scholars once laughed at those addicted to social media, and then, as the algorithms were continuously optimized, found themselves frantically scrolling through their feeds at midnight? It may become increasingly difficult to resist an artificial companion that knows you completely, never forgets, and can anticipate your needs better than anyone else. This companion has no self-interest and no goals of its own; it exists only for you; it is never tired, never annoyed by you, and never eager to interrupt your story to share its own.

Of course, these companions are currently "bodiless," which is their limitation. They are just words on the screen and voices in the ear, processing symbol streams in a data center. But this may not matter. I think of Spike Jonze's 2013 movie Her, in which Joaquin Phoenix's character falls in love with an operating system named Samantha (voiced by Scarlett Johansson). Many viewers also fell in love with her.

04 Thinking about the Essence of Relationships: Is It Response or Existence?

We need to carefully consider a core question: Can interactions with AI be considered real relationships? The writer Oliver Burkeman once wrote angrily that unless you believe that large language models have consciousness, "there is simply no one there to look at you, listen to you, or have emotions towards you. So how can it be called a relationship?"

In the article "In Praise of Empathic A.I.," my co-authors—Michael Inzlicht, C. Daryl Cameron, and Jason D'Cruz—and I pointed out that we are discussing AI that "shows convincing empathy." But whether AI companions are effective may depend precisely on whether we believe, to some extent, that it "really cares about you" and can "feel your emotions."

If future language models achieve consciousness, the problem will naturally change (and bring more serious new problems). But if they are always just simulations, the comfort is built on a special bargain: half deception and half self-deception. Psychologists such as Garriy Shteynberg recently wrote in Nature Machine Intelligence: "It is one thing to lose a loved one, or to have someone stop loving you; it is another to discover that the 'presence' you relied on, which gave you a sense of belonging and meaning, never really existed. That kind of despair may be like discovering you have been in a relationship with a psychopath."

Currently, the boundary between humans and programs is still clear—we can mostly see the code behind the mask. But as the technology improves, that mask will become more and more unbreakable. Popular culture has already depicted this trajectory: from Data in Star Trek, Samantha in Her, to Dolores in Westworld. Evolution has made humans naturally inclined to perceive the existence of "mind" in all things; but nature has never prepared us to face machines that are so good at pretending to have a "mind." Now, this ability to imitate is already realistic enough for some people, such as the lonely or the imaginative. And it won't be long before it may be enough to deceive almost everyone.

05 Will Loneliness Disappear When AI Can Accompany You?

I teach a freshman seminar at the University of Toronto. Last year, we spent an entire class discussing AI companions. Most of the students sided with the critics. In class discussions and written assignments (though I wondered how many were written by ChatGPT), they almost unanimously believed that AI companions should be strictly regulated, available only to researchers or those in real despair. Morphine requires a prescription; why should this new, addictive technology be an exception?

But I doubt that their wishes will be fulfilled. AI companions may stall, as self-driving cars have. But if there is a technological breakthrough, it will be difficult for governments to maintain strict control for long. People's desire for such companions may simply be too strong.

So, what kind of world will we live in when AI companions are within reach? Solitude is the engine of independent thinking and usually a prerequisite for creativity. It gives us the opportunity to communicate with nature and may even inspire a certain spiritual transcendence: Christ in the desert, Buddha under the bodhi tree, and poets walking alone. Susan Cain wrote in her book Quiet that solitude is the catalyst for discovery: "If you sit alone under a tree in the backyard while others are having a good time on the patio, you are more likely to be hit on the head by an apple."

However, solitude is not the same as loneliness. You can be alone without feeling lonely because you know you are loved and your connections still exist. The opposite is also true. Hannah Arendt once said: "One is most likely to feel lonely in the company of others." It's bad to be alone on Valentine's Day, but it seems even worse to feel lonely among loving couples. I guess the most intense feeling of loneliness often arises in the presence of loved ones. Many years ago, I sat in the living room with my wife and our two - year - old child at the