That evening, I was scrolling through dating apps when a profile caught my eye: “Henry VIII, 34 years old, King of England, non-monogamous.” Before I knew it, I found myself in a candlelit bar sharing a martini with the most notorious dater of the 16th century.
But the night wasn’t finished yet. Next, we took turns DJing alongside Princess Diana. “The crowd is primed for the drop!” she shouted over the music as she put on her headphones. Later, as I shivered in the cold waiting for Black Friday deals, Karl Marx philosophized about why 60% off is so irresistible.
In Sora 2, if you can imagine it—even if you think you shouldn’t—you can likely see it. Launched in October as an invite-only app in the US and Canada, OpenAI’s video app hit 1 million downloads within just five days, surpassing the initial success of ChatGPT.
While Sora isn’t the only AI tool producing videos from text, its popularity stems from two major factors. First, it makes it simple for users to star in their own deepfake videos. After entering a prompt, a 10-second clip is generated in minutes and can be shared on Sora’s TikTok-style feed or exported elsewhere. Unlike the low-quality, mass-produced “AI slop” flooding the internet, these videos show unexpectedly high production quality.
The second reason for Sora’s popularity is its ability to generate portraits of celebrities, athletes, and politicians—provided they are deceased. Living individuals must give consent for their likenesses to be used, but “historical figures” seem to be defined as famous people who are no longer alive.
This is how most users have used the app since its launch. The main feed is a surreal parade of historical figures: Adolf Hitler stars in a shampoo commercial; Queen Elizabeth II curses as she stumbles off a pub table; Abraham Lincoln beams on a daytime talkshow at the words “You are not the father”; and the Reverend Martin Luther King Jr expresses his dream that all drinks be complimentary before abruptly grabbing a cold drink and cursing.
However, not everyone is amused.
“It’s profoundly disrespectful to see the image of my father, who devoted his life to truth, used in such an insensitive manner,” Malcolm X’s daughter told the Washington Post. She was just two when her father was assassinated. Now, Sora’s clips show the civil rights leader engaged in crude humor.
Zelda Williams, the daughter of actor Robin Williams, urged people in an Instagram post to “stop” sending her AI videos of her father. “It’s silly and a waste of energy. Trust me, that’s not what he would have wanted,” she noted. Before his death in 2014, he took legal steps to prevent his likeness from being used in advertising or digitally inserted into films until 2039. “Seeing my father’s legacy turned into something grotesque by TikTok artists is infuriating,” she added.
Kelly Carlin, daughter of the late comedian George Carlin, described videos bearing her father’s likeness as “overwhelming and depressing” in a Bluesky post.
The recently deceased are also being depicted. The app is filled with clips of Stephen Hawking enduring a “#powerslap” that knocks over his wheelchair, Kobe Bryant dunking over an elderly woman while yelling about something stuck inside him, and Amy Winehouse wandering the streets of Manhattan with mascara streaming down her face.
Those who have died within the last two years (Ozzy Osbourne, Matthew Perry, Liam Payne) appear to be absent, suggesting they fall into a different category.
Each time these “puppetmasters” revive the dead, they risk reshaping the narrative of history, according to AI expert Henry Ajder. “People are worried that a world filled with this type of content could distort how these individuals are remembered,” he explains.
Sora’s algorithm favors content that shocks. One of the trending videos features Dr. King making monkey noises during his iconic “I Have a Dream” speech. Another depicts Kobe Bryant reenacting the tragic helicopter crash that claimed both his and his daughter’s lives.
Film and television have long recreated actors and comedians after their deaths, but under stricter legal protections: film studios bear responsibility for their content, while OpenAI does not assume the same liability for what appears on Sora. In certain states, featuring an individual for commercial use requires consent from the administrator of their estate.
“We couldn’t resurrect Christopher Lee for a horror movie, so why can OpenAI resurrect him for countless short films?” asks James Grimmelmann, an internet law expert at Cornell University and Cornell Tech.
OpenAI’s decision to place deceased personas into the public sphere raises distressing questions about the rights of the departed in the era of generative AI.
Legal Issues
It may feel unsettling to have the likeness of a prominent figure persistently haunting Sora, but is it legal? Perspectives vary.
Major legal questions regarding the internet remain unanswered. Are AI firms protected under Section 230 and thus not liable for third-party content on their platforms? If OpenAI qualifies for Section 230 immunity, users cannot sue the company for content they create on Sora.
“However, without federal legislation on this front, uncertainties will linger until the Supreme Court takes up the issue, which might stretch over the next two to four years,” notes Ashkhen Kazaryan, a specialist in First Amendment and technology policy.
OpenAI CEO Sam Altman speaks at Snowflake Summit 2025 on June 2 in San Francisco, California. He is one of the living individuals who permitted Sora to utilize his likeness. Photo: Justin Sullivan/Getty Images
In the interim, OpenAI must circumvent legal challenges by obtaining consent from living individuals. US defamation laws protect living people from defamatory statements that could damage their reputation. Many states have right-of-publicity laws that prevent using someone’s voice, persona, or likeness for “commercial” or “misleading” reasons without their approval.
Allowing the deceased to be depicted this way lets the company “test the waters,” Kazaryan suggests.
Though the deceased lack defamation protections, posthumous publicity rights exist in states such as New York, California, and Tennessee. How these laws apply to AI remains a “gray area,” as there is no established case law, according to Grimmelmann.
For a legal claim to succeed, estates will need to prove OpenAI’s responsibility, potentially by arguing that the platform encourages the creation of content involving deceased individuals.
Grimmelmann points out that Sora’s homepage features videos that actively promote this style of content. If the app utilizes large datasets of historical material, plaintiffs could argue it predisposes users to recreate such figures.
Conversely, OpenAI might argue that Sora is primarily for entertainment. Each video is marked with a watermark to prevent it from being misleading or classified as commercial content.
Generative AI researcher Bo Bergstedt emphasizes that most users are merely experimenting, not looking to profit.
“People engage with it as a form of entertainment, finding ridiculous content to collect likes,” he states. That may distress families, but it likely stays within the bounds of publicity-rights laws, which target commercial use.
However, if a Sora user creates well-received clips featuring historical figures, builds a following, and begins monetizing, they could face legal repercussions. Alexios Mantzarlis, director of Cornell Tech’s Security, Trust, and Safety Initiative, warns that the “financial implications of AI” may include indirect profit from these platforms. Sora’s rising “AI influencers” could face lawsuits from estates if they gain financially from the deceased.
“Whack-a-Mole” Approach
In response to the growing criticism, OpenAI recently announced that representatives of “recently deceased” celebrities can request their likenesses be removed from Sora’s videos.
“While there’s a significant interest in free expression depicting historical figures, we believe public figures and their families should control how their likenesses are represented,” a spokesperson for OpenAI stated.
The parameters for “recent” have yet to be clarified, and OpenAI hasn’t provided details on how these requests will be managed. The Guardian received no immediate comment from the company.
The copyright free-for-all strategy faltered after controversial content, such as “Nazi SpongeBob SquarePants,” circulated online and the Motion Picture Association accused OpenAI of copyright infringement. A week after launch, the company switched to an opt-in model for rights holders.
Grimmelmann hopes for a similar adaptation in how depictions of the deceased are handled. “Expecting individuals to opt out may not be feasible; it’s a harsh expectation. If I think that way, so will others, including judges,” he remarks.
Bergstedt likens this to a “whack-a-mole” approach to safeguards, one likely to persist until federal courts establish AI liability standards.
According to Ajder, the Sora debate hints at a broader question we will all confront: who controls our likenesses in the age of AI-generated media?
“It’s a troubling scenario if people accept they can be used and exploited in AI-generated hyper-realistic content.”
Source: www.theguardian.com

