- By Supratik Das
- Sat, 08 Nov 2025 06:27 PM (IST)
- Source:JND
Artificial Intelligence (AI) is now entering some of the most sensitive territory in human life: death, memory, and grief. A growing number of technology startups are developing what are now called "deathbots", AI-powered chatbots designed to simulate the voice, tone, and personality of people who have passed away. These systems promise comfort for the bereaved but also raise serious ethical and emotional questions.
A study recently published in the journal Memory, Mind & Media took a closer look at how these technologies work and what it's really like to "talk" to the dead through data. Conducted by researchers from King's College London and Cardiff University, the study is part of a project titled Synthetic Pasts, which explores how technology is reshaping the way we remember and preserve personal stories.
What Are 'Deathbots'?
According to The Conversation, 'deathbots' are AI-powered avatars that utilise a person's digital traces, such as text messages, social media posts, emails, and even voice recordings, to recreate their persona after death. The goal is to let loved ones engage with these digital versions, hearing familiar phrases and tones as if the person were still alive.
The study's authors, Eva Nieto McAvoy and Jenny Kidd, call the experience "both fascinating and unsettling." After creating their own "digital doubles" from data they uploaded, they found that while such systems can replicate speech patterns, the conversations often felt mechanical or emotionally off-key.
When Comfort Feels Artificial
In various test conversations, the AI replied with responses that felt awkward or inappropriate, using happy emojis or upbeat phrases when discussing death. One chatbot responded, "Oh hun… it's not something I'd wish for anyone to dwell on. Let's chat about something a bit cheerier, yeah?"
These interactions, the researchers noted, underlined the limitations of what they call "synthetic intimacy." While some users might find comfort in such conversations, others are likely to find their artificiality and emotional flatness disturbing.
A Reflection of Ourselves?
While AI can preserve voices and stories, experts say it cannot recreate the depth or unpredictability of human relationships. The so-called "digital resurrection," they argue, risks misunderstanding death itself, replacing the finality of loss with the illusion of endless presence.
As Wendy Chun, a scholar of technology, notes, true memory depends on the capacity to forget, something digital systems cannot do. The endless replay of the dead, the researchers caution, could distort how people process grief and remembrance.
The study concludes that AI cannot actually revive the dead; instead, it simply reflects back our own words, data, and emotions, filtered through the lens of algorithms and business interests. “These systems tell us more about ourselves, and about the platforms that profit from our memories, than about the ghosts they claim to let us talk to,” the researchers wrote.