Joaquin Oliver was just 17 when he was shot in his high school hallway. An older student, expelled a few months earlier, opened fire with a high-powered rifle on Valentine's Day in one of America's deadliest school shootings. Seven years later, Joaquin believes it is important to talk about what happened that day in Parkland, Florida.
Joaquin, of course, did not survive that day. The eerie, metallic voice that conversed with former CNN journalist Jim Acosta in this week's Substack interview was a digital ghost: an AI trained on the teenager's old social media posts and built at the behest of his grieving parents. Like many bereaved families, they have told their son's story again and again, often to heartbreakingly little effect, and their desperation to explore every possible avenue is entirely understandable.
The technology has allowed his father, Manuel, to hear his son's voice once more. His mother, Patricia, spends hours asking the AI questions, prompting it to say, "I love you, Mom."
The grieving parents should not be judged for their choices. If they find solace in preserving their dead child's room as a shrine, speaking to a gravestone or wearing a shirt that still carries their scent, that is their own business. People cling to what they have. After 9/11, families replayed tapes of their loved ones until they wore out: voicemail messages left by the dead, even farewell calls made from the hijacked planes. I have a friend who often revisits old WhatsApp conversations with his late sister. Another texts snippets of family news to an image of his dead father. Others consult psychics, driven by a profound need for closure. The struggle to move past grief leaves people open to exploitation, and the burgeoning market for digital resurrection is a testament to that vulnerability.
Like the AI-generated video shown at a Rod Stewart concert this week, featuring late music icons such as Ozzy Osbourne, this technology opens up intriguing, even unsettling, possibilities. It may serve short-term purposes, as with the AI avatar recently created by the family of a shooting victim to address the judge during the shooter's trial. But it raises profound questions about identity and mortality. What if a permanent AI version of a dead person could exist, perhaps as a robot, allowing the conversation to go on for ever?
AI images of Ozzy Osbourne and Tina Turner were shown at a Rod Stewart concert in the US in August 2025. Photograph: Iamsloanesteel/Instagram
Resurrection has traditionally been seen as a divine power, not something to be trivialized by high-tech zealots with a Messiah complex. And while the rights of the living to protect their identities from AI-generated deepfakes are becoming clearer in law, the rights of the dead remain murky.
Reputations die with us (the dead cannot be libeled), DNA is protected posthumously, and laws govern how human remains must be treated with dignity. But an AI is trained on a person's voice, messages and images, and those carry their own significance. When my father died, I felt his presence in his old letters, the gardens he tended and recordings of his voice. Everyone grieves differently, though. What happens if some family members want to digitally resurrect a loved one while others would rather let go?
Joaquin Oliver's AI can't grow up; he remains forever 17, fixed in a teenage persona molded by social media. In the end it is not his family but his murderer who set the limits of his story. Manuel Oliver understands that the avatar is not really his son, and he is not trying to resurrect him; for him, the technology simply extends the family's efforts to tell Joaquin's story. But he is concerned about the implications of giving the AI access to social media accounts, of letting it upload videos or gather followers. What if it starts fabricating memories, or strays into subjects Joaquin would never have addressed?
For now there are noticeable glitches in AI avatars, but as the technology improves it could become increasingly hard to tell them from real people. Businesses and government bodies already use chatbots to field customer service inquiries; it may not be long before they consider putting up public relations avatars for journalist interviews. By agreeing to interview someone who, technically, does not exist, Acosta risks further muddying the waters of our confused post-truth world. The most obvious danger is that conspiracy theorists will cite interviews like this as "proof" that narratives contradicting their beliefs are fabricated.
Journalists aren't the only ones who will face these challenges. As AI evolves, we will all find ourselves interacting with synthetic versions of people. This goes beyond basic assistants like Alexa or simple chatbots: there are already accounts of people forming bonds with, even falling in love with, AI companions, and such systems are expected to become increasingly nuanced and emotionally intelligent. With one in 10 Britons reporting that they have no close friends, it is no surprise that a market for AI companionship is growing into the void left by human relationships.
Ultimately, we may decide as a society that technological substitutes can fill the gaps left by absent friends or loved ones. But there is a significant difference between comforting the lonely and preying on those who have lost someone dear to them. As the verse often recited at funerals has it, there is a time to be born and a time to die. When we can no longer tell which is which, what does that do to our understanding of existence?
Source: www.theguardian.com