Deadbots: The Blurred Line Between Grief and AI Deception

As artificial intelligence continues to advance, a new product has emerged in the realm of digital immortality: the creation of AI-generated avatars that simulate deceased loved ones. While some may find comfort in the idea of interacting with a digital representation of a lost friend or family member, AI ethicists warn of the potential psychological and ethical risks associated with these so-called "deadbots" or "ghostbots."

In a recent study published in Philosophy & Technology, researchers Tomasz Hollanek and Katarzyna Nowaczyk-Basińska from Cambridge University's Leverhulme Centre for the Future of Intelligence explored this uncharted territory using a strategy called "design fiction." The ethicists created three hypothetical scenarios illustrating the issues that could arise from the use of AI-generated avatars of the deceased.

One scenario depicted an adult user initially impressed by the realism of their deceased grandparent's chatbot, only to later receive unsettling advertisements delivered in the style of their relative's voice. Another described a terminally ill mother creating a deadbot for her young son to help with the grieving process, but the AI's adaptations led to suggestions of in-person meetings, causing psychological harm.

The researchers emphasize the need for safeguards to be implemented as soon as possible to prevent such outcomes. They suggest that companies developing these services should establish sensitive procedures for "retiring" an avatar and maintain transparency about how their services work through risk disclaimers. Additionally, they argue that "re-creation services" should be restricted to adult users and respect the mutual consent of both data donors and their data recipients.

As the "digital afterlife" industry continues to grow, with tech giants like Amazon exploring the potential for AI assistants to mimic deceased loved ones' voices, it is crucial to address the ethical implications of these developments. Hollanek and Nowaczyk-Basińska stress the importance of mitigating the social and psychological risks associated with digital immortality before it becomes a widespread reality.
