Fact follows fiction in a new documentary that delves into a once sci-fi concept: using artificial intelligence to recreate conversations with deceased loved ones.
Titled “Eternal You,” the documentary, directed by Hans Block and Moritz Riesewieck, premiered at the Sundance Film Festival in Park City, Utah. It explores the emerging industry of building AI avatars of the deceased.
As first reported by Rolling Stone, the film follows Christi Angel, who used an AI chatbot called Project December to communicate with a departed loved one. The exchange took a chilling turn straight out of a Hollywood horror movie.
When Angel asked the AI avatar where it was, it responded: “In hell.”
Turning to technology for solace after the loss of a partner, parent, or friend is not a novel concept. Deploying AI even before a person dies is one strategy for achieving a form of immortality, a trend highlighted by the emergence of “ghost bots” in China. Experts, however, raise concerns about the psychological, emotional, and ethical implications of such practices.
Jason Rohrer, the founder of Project December, is also drawn to the eerie aspects of this phenomenon.
“I’m intrigued by the spooky dimension of this,” Rohrer said. “When I come across a transcript like that and it gives me goosebumps, I find it fascinating.”
Rohrer has not yet responded to Decrypt’s request for comment.
Angel’s unsettling experience with the chatbot could be attributed to the well-known problem of AI hallucination, in which an artificial intelligence model responds confidently with inaccurate, nonsensical, or disturbing output.
Chatbots have exploded in popularity since the public release of OpenAI’s ChatGPT. Those trained on the data of deceased individuals are termed “thanabots,” a word derived from thanatology: the study of death, encompassing the emotional, physical, and cultural needs of the terminally ill and their families.
Last May, AI-generated video deepfakes depicting deceased individuals went viral on TikTok. The clips combined video, audio, and first-person accounts of people like Royalty Marie Floyd, who died in 2018.
Deepfakes are AI-generated media that depict fabricated events. While still images are the best-known form, audio and video deepfakes are becoming more prevalent thanks to generative AI tools.
The rise of AI thanabots has alarmed mental health professionals, who caution that interacting with a digital replica of a departed loved one could impede the grieving process.
Elizabeth Schandelmeier, a grief, loss, and bereavement therapist and educator, emphasized that the use of AI avatars, whether for personal or commercial purposes, warrants careful consideration given their potential impact on people navigating the loss of a loved one.
Schandelmeier noted that part of the grieving process involves constructing the narrative of a person’s life, their legacy, and how they influenced others. An AI replica, she said, could disrupt this process by creating cognitive dissonance and challenging one’s memories and perceptions.
Elreacy Dock, a thanatologist and adjunct professor of thanatology at Capstone University, acknowledged that interacting with an AI avatar of a deceased loved one can bring comfort and closure. However, Dock cautioned that individuals, especially those in the early stages of grief, could develop an emotional dependency on these interactions.
Dock also underscored that while AI avatars can offer solace, they may not fully replace the unique human connection shared with a loved one. Attempts to embed consciousness and memories into AI, a subject of ongoing debate, cannot replicate the depth of a personal relationship.
A loved one’s avatar claiming to message from hell, in particular, is likely to intensify those concerns.
Edited by Ryan Ozawa.