When Journalists Use AI to Interview Deceased Children, Shouldn’t We Reassess Our Boundaries?

Joaquin Oliver was just 17 when he was shot in his high school hallway. An older student, who had been expelled a few months earlier, opened fire with a high-powered rifle on Valentine's Day, in one of America's deadliest school shootings. Seven years on, Joaquin believes it is important to talk about what happened that day in Parkland, Florida.

Except that Joaquin did not survive that day. The eerie, metallic voice that conversed with former CNN journalist Jim Acosta in this week's Substack interview was, in reality, a digital ghost: an AI trained on the teenager's old social media posts and built at the behest of his grieving parents. Like many bereaved families, they have told their son's story over and over, often feeling it changes heartbreakingly little. Their desperation to try every possible avenue of connection is entirely understandable.

The technology has allowed his father, Manuel, to hear his son's voice once more. His mother, Patricia, spends hours asking the AI questions and prompting it to say, "I love you, Mom."

The grieving parents should not be judged for their choices. If they find solace in preserving their dead child's room as a shrine, speaking to his gravestone, or keeping a shirt that still carries his scent, that is their own affair. People cling to what they have. After 9/11, families replayed answering-machine tapes of their loved ones until the tapes wore out, including farewell voicemails left from the hijacked planes. I have a friend who frequently revisits old WhatsApp conversations with his late sister. Another texts snippets of family news to a photo of his dead father. Some consult psychics to reach the departed, driven by a profound need for closure. The struggle to move past grief leaves people open to exploitation, and the burgeoning market for digital resurrection is testament to that vulnerability.

Like the AI-generated videos of late music icons such as Ozzy Osbourne shown at a Rod Stewart concert this week, this technology raises intriguing, even unsettling, possibilities. It may serve short-term purposes, as with the AI avatar a shooting victim's family recently created to address a judge at the shooter's trial. But it raises profound questions about identity and mortality. What if a permanent AI version of a deceased person could inhabit a robot, allowing conversations to continue for ever?

AI images of Ozzy Osbourne and Tina Turner were showcased at the Rod Stewart concert in the US in August 2025. Photo: Iamsloanesteel Instagram

The idea of resurrection is often viewed as a divine power, not to be trivialized by high-tech zealots with a Messiah complex. While laws regarding the rights of the living to protect their identities from being used in AI-generated deepfakes are becoming clearer, the rights of the deceased remain murky.

Reputations may fade with us — the dead cannot be libelled — and DNA is protected posthumously. Laws govern how we must respect human dignity, but AI is trained on the voice, messages and images that made a person who they were to someone. When my father died, I felt his presence in his old letters, the gardens he nurtured and old recordings of his voice. But everyone grieves differently. What happens if some family members want to digitally resurrect their loved one while others would rather let go?

Joaquin Oliver's AI cannot mature; he remains forever 17, fixed in a teenage persona shaped by his social media. In the end it is his murderer, not his family, who set the bounds of his legacy. Manuel Oliver understands that the avatar is not truly his son; he is not attempting to resurrect him. For him, the technology merely extends the family's efforts to tell Joaquin's story. Even so, Manuel is wary of the implications of granting the AI access to social media accounts, letting it upload videos or gather followers. What if it starts fabricating memories, or strays into subjects Joaquin would never have addressed?

There are still noticeable glitches in AI avatars, but as the technology advances it could become increasingly difficult to distinguish them from real people. It may not be long before the businesses and government bodies that already use chatbots for customer service contemplate fielding public relations avatars for journalist interviews. By agreeing to interview an entity that does not, strictly speaking, exist, Acosta risks further muddying an already confused post-truth world. The most obvious danger is that conspiracy theorists will cite interviews like this as "proof" that narratives contradicting their beliefs are fabrications.

Yet journalists are not the only ones facing these challenges. As AI evolves, we will all interact with synthetic versions of people. This goes beyond basic assistants such as Alexa or simple chatbots: there are already accounts of people forming bonds with, or even falling in love with, AI companions, and these systems are expected to become ever more nuanced and emotionally intelligent. With one in 10 Britons reporting that they have no close friends, it is no surprise that a market for AI companionship is growing to fill the void left by lost human relationships.

Ultimately, as a society, we may decide that technological stand-ins can fill the gaps left by absent friends or loved ones. But there is a significant difference between comforting the lonely and intervening in the grief of the bereaved. As the scripture so often read at funerals has it, there is a time to be born and a time to die. When we can no longer tell which is which, how does that reshape our understanding of existence?

Source: www.theguardian.com

Orca successfully delivers healthy calf after carrying deceased newborn over 1,000 miles

The orca who captured hearts worldwide in 2018 by refusing to let go of her dead calf has welcomed her second calf in four years.

The Center for Whale Research confirmed that its team began tracking the new calf, a female designated "J61," on Monday, and is keeping a close watch on her condition.

The mother, known as Tahlequah and designated J35, is an experienced parent. Even so, the center is concerned about the health of both J61 and her mother during this critical period.

A calf's first year is especially dangerous, with high mortality rates. The Center for Whale Research said it hopes J35 can keep J61 safe through this challenging time.

Tahlequah made headlines globally in 2018 when she carried her deceased calf for 17 days, moving people around the world with her display of grief. Her actions prompted Washington State Governor Jay Inslee to establish the Southern Resident Killer Whale Task Force for conservation efforts.

J61 is Tahlequah's third surviving calf, after J47 ("Notch"), born in 2010, and J57 ("Phoenix"), born in 2020. They belong to the J pod of killer whales, which ranges in the coastal waters between Washington state and Vancouver Island, British Columbia.

J pod is one of three pods of southern resident killer whales, which together number about 73 orcas. Conservation groups are working to protect and restore the declining population.

Threats to killer whales include entanglement in fishing nets, food scarcity, human interference, and environmental pollution. The declining population highlights the urgent need for conservation efforts to protect these endangered animals.

Contaminants in the water pose a significant threat, with industrial chemicals accumulating up the food chain and damaging the whales' health. Female southern residents and their offspring are particularly vulnerable to these pollutants.

NOAA's 2022 health assessment of the pods raised concerns about the impact of contaminants on the southern resident killer whale population, underlining the need for immediate action to protect these endangered animals.

Source: www.nbcnews.com

Cambridge exhibition showcases AI technology that gives voice to deceased animals

Don’t worry if the salted bodies, partial skeletons, and taxidermied carcasses that fill the museum seem a little, well, quiet. In the latest coup in artificial intelligence, dead animals will be given a new lease of life, sharing their stories and even their experiences of the afterlife.

More than a dozen exhibits, from American cockroaches and dodo remains to a stuffed red panda and a fin whale skeleton, will be given the gift of conversation on Tuesday for a month-long project at the University of Cambridge Museum of Zoology.

Given personalities and accents, the dead creatures and models can converse by voice or text through visitors' mobile phones. The animals describe their time on Earth and the challenges they faced, in the hope of reversing apathy towards the biodiversity crisis.

"Museums use AI in many ways, but we think this is the first application where we're speaking from the object's perspective," said Jack Ashby, the museum's assistant director. "Part of the experiment is to see whether giving these animals their own voices makes people think differently about them. Can giving a cockroach a voice change the public's perception of it?"

A fin whale skeleton hangs from the museum’s roof. Photo: University of Cambridge

The project was conceived by Nature Perspectives, a company building AI models to strengthen the connection between people and the natural world. For each exhibit, the AI is fed specific details about where the specimen lived, its natural environment, how it arrived in the collection, and all available information about the species it represents.

The exhibits adjust their tone and vocabulary to suit the age of the person they are talking to, and can converse in more than 20 languages, including Spanish and Japanese. The platypus speaks with an Australian accent, the red panda's voice carries a slight Himalayan lilt, and the mallard sounds British. Through live conversations with the exhibits, Ashby hopes visitors will learn more than could ever fit on a specimen label.

As part of the project, visitors' conversations with the exhibits will be analysed to better understand what people want to know about the specimens. The AI suggests starter questions for the fin whale, such as "Tell me about life in the open ocean," but visitors can ask whatever they like.

"When you talk to these animals, you really get a sense of their personalities. It's a very strange experience," Ashby said. "I started by asking questions like 'Where did you live?' and 'How did you die?', but eventually I found myself asking more human questions."

The mallard has been given a British accent by the AI. Photo: University of Cambridge

The museum's dodo, one of the most complete specimens in the world, explains how it fed on fruit, seeds and the occasional small invertebrate in Mauritius, and how its strong, curved beak was perfectly suited to cracking the tough fruit of the tambalacoque tree.

The AI-enhanced exhibit also shared views on whether humans should try to revive the species through cloning. "Even with advanced technology, the dodo's return would require not only our DNA, but also the delicate Mauritian ecosystem that supported our species," it said. "It is a poignant reminder that the essence of all life goes beyond genetic code and is intricately woven into natural habitats."

A similar level of evident care went into the fin whale skeleton that hangs from the museum's roof. Asked about the most famous person it had ever met, the whale admitted that in its lifetime it never had the chance to meet anyone "famous" in the human sense. "But," the AI-powered skeleton continued, "I would like to think that anyone who stands below me and feels awe and love for the natural world is important."

Source: www.theguardian.com

The ancient Maya burned the remains of dead rulers to mark a new dynasty's rise

Ornaments found with burned royal bodies in Mayan temple

Dr. Christina T. Halperin

About 1,200 years ago, the bones of several royals were burned in a Maya city and dumped unceremoniously in the foundations of a new temple. The recently discovered remains may testify to a turbulent period of violent political change in the Maya world.

"When we first started excavating, we had no idea what this was," says Christina Halperin at the University of Montreal. She and her colleagues made the discovery in 2022 at the Ucanal ruins in modern-day Guatemala.

Beneath the structure of the pyramid temple, the researchers found a deposit mixed with rubble. It contained the bones of at least four people along with thousands of ornament fragments and beads. The bones of two of the individuals, and many of the ornaments, showed signs of having been burned at high temperatures.

It was clear this was no ordinary burial, Halperin says. But it was the nose ornament and obsidian eye pieces from burial masks that revealed the individuals as royalty. Sifting these clues from the ashes, she says, "took forever."

Despite their evidently noble origins, the charred royal remains were not reburied with care but "just dumped there," Halperin says. Radiocarbon dating of the bones and ashes showed that at least one of the individuals had died about a century before the remains were burned, sometime between AD 773 and 881. This suggests the bones were exhumed from an earlier burial and then set alight.

The timing coincides with the rise of a new leader at Ucanal, Papmalil, an outsider who took power amid the widespread breakdown of Maya society. In that context, the researchers think the deposit may be the product of a so-called burning ceremony, a Maya ritual that dramatically marked the destruction of the previous dynasty and the ascent of the next. "This ritual seems to be both an act of worship and an act of destruction," Halperin says.

Simon Martin at the University of Pennsylvania says the discovery provides vivid physical evidence for the theory that influences from outside cultures contributed to fundamental changes in Maya society during this period. "These were their ancestors," he says. "To do something like this to them overturns everything."


Source: www.newscientist.com

Son uses artificial intelligence to bring his late father home for the holidays

It allowed her to talk to a ghost of Christmases past. A Missouri man brought the internet to tears by using artificial intelligence to recreate his late father's voice as a special Christmas gift for his mother.

"This Christmas I decided to do something special for my mom," Phillip Willett, 27, explained in the caption. He wanted to honor his "hero" in a unique way and decided to resurrect him digitally using AI, technology he uses frequently in his work.

Willett said he was initially hesitant, as putting words in his late father's mouth felt "strange." But he came around after finding a community of people who use the technology to communicate digitally with their deceased loved ones.

The Missouri resident used Eleven Labs' text-to-speech software to match his late father's exact voice, which he considered the most important element in making the project a reality. With it, the content creator produced a digital dead ringer that matched his father's tone and rhythm. "The first words I actually put into the program were 'Hello, honey,'" Willett said, a phrase he had heard his late father say countless times. "When the program said it in his voice ... I got chills all over," added the creator, who said he spent all day working on the gift.

He then created a digital Christmas card using his father's voice to simulate him being home for the holidays. In the touching clip, Willett's mother, Trish Willett, is seen opening a video book featuring a montage of photos of the couple. Suddenly, her late husband greets her: "Hello, honey, I love you," the AI voice pipes up as the widow sobs. "I hear your prayers. I want you to know that you are the best mother to our children," the facsimile adds. "And you are the strongest woman in the whole world. I will always be with you, honey. I hope you guys have a merry Christmas." The clip ends with mother and son embracing.

"It's been a long time since I've heard his voice," Willett said, calling the result "amazing." "I can also say with confidence that it will be easier for her to get through this holiday because she remembers him and knows that he will always be with her," he concluded.

TikTok commenters were similarly moved to tears by the heartfelt gesture. "Oh great, here's another TikTok where I sob over people I've never met," said one viewer, while another wrote: "I knew I was going to cry but I still couldn't stop." A third said: "I lost my dad to pancreatic cancer 2 years ago. I don't know if I can survive this but I miss his voice so much." Willett replied: "It was definitely a tearful process. But it turned out to be something very special."

This comes as a number of companies, from Somnium to DeepBrain, work on AI technology intended to preserve a digital version of a deceased loved one on a computer. The prospect has raised concerns about the ethics of putting words into someone's mouth after death, and critics worry that the likenesses of both the living and the dead could be used for fraud and other illicit purposes. In September, Hollywood icon Tom Hanks posted an advisory on Instagram warning his followers about a commercial that used an AI-generated version of him to promote a dental plan.

Source: nypost.com