Reviving loved ones with AI for 9.9 yuan: Is it technological comfort or an emotional illusion?
"I've lived to 108 years old and tasted tea for over 80 years."
In October 2025, Zhang Tianfu, the "grandmaster of the tea industry" who had died eight years earlier, "spoke" again in an AI-generated endorsement for Fujian Zhang Tianfu Brand Management Co., Ltd.
Zhang Tianfu's son, Zhang Deyou, claimed the move was legally authorized, while Zhang's widow, Zhang Xiaohong, condemned it as "defamation and insult" and said she would defend her rights through legal means.
A family dispute triggered by a single authorization has fully exposed the legal and ethical blind spots of using artificial intelligence (AI) to "resurrect" the deceased.
Today, on e-commerce platforms, a single photo and a voice recording, at prices from 9.9 yuan to tens of thousands of yuan, are enough to grant a deceased loved one "digital immortality." AI "resurrection" technology is quietly evolving from an experimental tool for emotional comfort into a mature industrial chain.
This raises a new question: is the person we "resurrect" with technology really them?
Recently, a research paper published in Memory, Mind & Media argued that while AI lets people "reunite with the deceased," seemingly resurrecting memories, it actually reveals how technology and capital manipulate the ways we remember, feel, and mourn. The "voice of the deceased" that people hear is, in essence, the algorithm and the platform "speaking on behalf of" our desires, longings, and projections, not the real deceased.
Platformized Memory: When AI Becomes Humanity's "Memory Agent"
AI resurrection technology continues the ancient human tradition of connecting with the dead through physical media. Unlike tombstones or memorial albums, however, its innovation lies in dialogue: it enables two-way communication between the living and the deceased. AI no longer merely stores memories but simulates "real-time conversation."
But AI does not passively reproduce the past; it actively generates new memory narratives and participates in the reproduction of memory. When an algorithm re-creates "memories" from probabilities and data, how can humans tell the real from the fake?
Researchers point out that this experience of talking with the deceased is really an "algorithmic as if." Through a language model's imitation, humans gain the illusion of "still communicating." In mourning and loneliness, people willingly let themselves be deceived for short-term emotional comfort. The illusion seems to extend the connection with the deceased, but in fact it exposes the duality of AI resurrection technology: it meets human emotional needs while weakening the acceptance of death and the ability to let go.
In addition, the digital platforms behind AI resurrection technology drive usage and payment in the name of "interactive companionship," turning "mourning" into a sustainable emotional consumer product and memory from a private emotion into a commodity. When memories, emotions, and acts of mourning are reorganized and regulated by platform algorithms and folded into data production and capital accumulation, human memory is also framed by algorithmic logic: it loses its original fluidity and ambiguity and becomes platformized, updatable, quantifiable memory units.
How are Memories Algorithmized, Commercialized, and Re-created?
The researchers created or experienced digital memorial portraits as users on four representative digital immortality platforms (Almaya, HereAfter, Seance AI, and YOV), directly experiencing how AI constructs, organizes, and interprets memories, and thereby revealing how platforms shape human emotional experience and mourning through technology.
1. Types of Digital Immortality Platforms
Preservation-oriented platforms centered on "archiving": Almaya and HereAfter, for example, emphasize structured narratives, authenticity, and the inheritance of memory, serving mainly as "preservation" tools. Users record audio and video or upload photos during their lifetime, and AI organizes these materials through indexing and classification so that relatives can interact with the digital "deceased." AI acts as an "archival administrator," responsible for retrieval and playback, without generating new content. These platforms pursue a linear, closed view of memory, in which memories can be fixed, archived, and inherited.
Image | Screenshot of the Almaya app
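The "archival administrator" logic described above can be sketched in a few lines: recorded clips are indexed, and a relative's question retrieves a stored clip verbatim, with nothing new ever generated. This is an illustrative assumption about how such retrieval might work, not the actual API of Almaya or HereAfter; the `Clip` and `ArchiveIndex` names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    topic: str
    transcript: str  # what the person recorded during their lifetime

class ArchiveIndex:
    """Retrieval-and-playback only: never generates new content."""

    def __init__(self):
        self._clips = []

    def add(self, clip: Clip):
        self._clips.append(clip)

    def retrieve(self, question: str):
        """Return the stored clip whose topic overlaps the question most."""
        q_words = set(question.lower().split())
        best, best_score = None, 0
        for clip in self._clips:
            score = len(q_words & set(clip.topic.lower().split()))
            if score > best_score:
                best, best_score = clip, score
        return best  # None if no topic matches: the archive stays silent

archive = ArchiveIndex()
archive.add(Clip("wedding day", "We married in spring; it rained all morning."))
archive.add(Clip("first job", "I started as an apprentice at sixteen."))

hit = archive.retrieve("Tell me about your wedding day")
print(hit.transcript)  # plays back the recording unchanged
```

The design point is the closed loop: if no recorded clip matches, the system returns nothing rather than inventing an answer, which is exactly the "linear and closed view of memory" the researchers attribute to these platforms.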
Generative platforms centered on generative AI: Seance AI and YOV, for example, emphasize imaginative continuation and generative identities. Seance AI generates virtual conversations with the GPT-4 model; YOV goes further, integrating voice, social media, and text message data to construct a "Versona" that can learn and grow. These platforms do not preserve memories so much as continuously regenerate them, letting the past be reinterpreted in every conversation. Here AI acts as an "affective agent," constructing a "plausible" sense of presence through algorithmic imagination and language generation.
Image | Seance AI is "contacting" Jen (the deceased)
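By contrast, the generative approach folds personal data into a persona prompt and produces each reply anew rather than playing anything back. The sketch below is a hypothetical illustration, not YOV's or Seance AI's actual design: `fake_llm` is a stand-in for a real language model such as GPT-4, and the persona structure is an assumption.

```python
def build_persona_prompt(name, traits, sample_messages):
    """Assemble a persona prompt from the deceased's data (illustrative)."""
    return (
        f"You are {name}. Personality: {', '.join(traits)}. "
        f"Typical phrases: {' / '.join(sample_messages)}"
    )

def fake_llm(prompt, user_message):
    # Placeholder: a real model would generate free text conditioned on
    # both the persona prompt and the user's message, differently each time.
    name = prompt.split(".")[0].replace("You are ", "")
    return f"[{name}, generated] In response to '{user_message}': ..."

persona = build_persona_prompt(
    "Jen", ["warm", "wry"], ["See you soon, love", "Don't fuss"]
)
print(fake_llm(persona, "Do you remember our last trip?"))
```

Unlike the archival sketch, there is no stored transcript to return: every answer is synthesized on the spot, which is why the researchers describe these systems as regenerating, rather than preserving, memory.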
Preservation-oriented platforms turn memories into commercialized legacies, while generative platforms turn mourning into continuous algorithmic events. Both use emotional interaction as a commercial entry point, transforming memory practices from private rituals into controlled digital experiences.
2. Two Technical Logics of Algorithms
1) Reconstructing Memories
On digital immortality platforms, memories are no longer the product of humans alone but are jointly constructed by users, platform designs, preset templates, and generative algorithms. This co-construction is not equal: the structural choices of algorithms and platforms often flatten users' personalities into algorithm-readable formats, while transferring emotion and responsibility from humans to machines, producing "pseudo-agency" and "pseudo-intimacy."
YOV and Seance AI can generate virtual personalities that "seem very considerate," but when the responses turn generic or insensitive, their artificial limits show: YOV's replies, for instance, are generalized, and Seance AI dodges questions about the cause of death or uses emojis inappropriately.
Platforms maintain user stickiness by simulating empathy, but this simulation is neither ethical care nor the assumption of a social relationship. Emotional weight shifts from responsibility between people onto an "emotionally reactive" algorithm, which can create misleading dependencies: users may mistake the algorithm's output for real conversation or consolation, avoiding the subject of death and dwelling in the illusion for a long time.
2) Manipulating Emotions
Digital immortality platforms are a kind of affective infrastructure of memory. They not only transmit information but also shape emotional flows and social atmospheres, actively guiding how humans perceive memories.
The main motivation for using digital immortality platforms is to continue an emotional connection. Yet when users hear AI reproduce the voice or tone of the deceased, they experience intense, complex emotions: familiar yet strange, even uneasy, easily triggering the uncanny valley effect. AI blurs the line between "memory" and "illusion," and this mourning suspended between truth and falsehood leaves users' psychology in a fragmented state.
Meanwhile, on recording platforms such as Almaya and HereAfter, researchers found that users actively adjust their tone or content according to who might hear the stories in the future. Once people know they will achieve digital immortality, they begin self-censorship and self-narrative management. Memory becomes a performance, and the digital self is no longer a preservation of the real self but an idealized version templated by algorithms.
Nowadays, as humans interact more frequently with these "algorithmized deceased," the lines between virtual and real, manipulation and empathy are becoming increasingly fragile.
AI resurrection technology does more than remember a person: it lets humans recreate "the self related to that person" and carry on the imprint that relationship left on us. At the same time, it redefines what counts as "real emotion" and what counts as "effective memory."
When algorithms intervene in emotional labor and death no longer means the end but becomes a relationship continuation maintained by technology, can we still maintain sincere emotions?
Compiled by: Xiaoxiao
This article is from the WeChat official account "Academic Headlines" (ID: SciTouTiao), author: Academic Headlines, published by 36Kr with authorization.