Bryan Cranston has expressed his “gratitude” to OpenAI for addressing deepfakes of him on its generative AI video platform Sora 2, after users generated his voice and likeness without his permission.
The Breaking Bad actor raised concerns with the actors’ union SAG-AFTRA after Sora 2 users generated his likeness during the platform’s recent launch. On October 11th, the LA Times reported that in one instance, “a synthetic Michael Jackson takes a selfie video using an image of Breaking Bad star Bryan Cranston.”
Living individuals must give explicit consent, or opt in, to appear in Sora 2. In statements following the release, OpenAI confirmed it has implemented “measures to block depictions of public figures” and established “guardrails to ensure audio and visual likenesses are used with consent.”
However, upon Sora 2’s launch, several outlets, including the Wall Street Journal, the Hollywood Reporter, and the LA Times, reported that OpenAI had told talent agencies that if they did not want their clients’ likenesses or copyrighted material to appear in Sora 2, they would need to opt out rather than opt in, causing an uproar in Hollywood.
OpenAI disputes these claims and told the LA Times its goal has always been to give public figures control over how their likenesses are used.
On Monday, Cranston released a statement via SAG-AFTRA thanking OpenAI for “enhancing guardrails” to prevent users from generating unauthorized depictions of him.
“I was very concerned, not only for myself but for all performers whose work and identities could be misappropriated,” Cranston commented. “We are grateful for OpenAI’s enhanced policies and guardrails and hope that OpenAI and all companies involved in this endeavor will respect our personal and professional rights to control the reproduction of our voices and likenesses.”
Two of Hollywood’s top agencies, Creative Artists Agency (CAA) and United Talent Agency (UTA), the latter of which represents Cranston, have repeatedly highlighted the potential dangers Sora 2 and similar generative AI platforms pose to their clients and their careers.
Nevertheless, on Monday, UTA and CAA released a joint statement alongside OpenAI, SAG-AFTRA, and the Association of Talent Agents, declaring that what happened with Cranston was inappropriate and that they would work together to ensure the actor’s “right to determine how and whether he can be simulated.”
“While OpenAI has maintained from the start that consent is required for the use of voice and likeness, the company has expressed regret over these unintended generations. OpenAI has reinforced its guardrails concerning the replication of voice and likeness without opt-in,” according to the statement.
Actor Sean Astin, the new president of SAG-AFTRA, cautioned that Cranston is “one of many performers whose voices and likenesses are at risk of mass appropriation through reproduction technology.”
“Bryan did the right thing by contacting his union and professional representatives to address this issue. We now have a favorable outcome in this case. We are pleased that OpenAI is committed to implementing an opt-in protocol, which enables all artists to decide whether they wish to participate in the exploitation of their voice and likeness using AI,” Astin remarked.
“To put it simply, opt-in protocols are the only ethical approach, and the NO FAKES Act enhances our safety,” he continued. The NO FAKES Act, under consideration in Congress, would prohibit the production and distribution of AI-generated replicas of any individual without their consent.
OpenAI has publicly supported the NO FAKES Act, with CEO Sam Altman stating the company is “firmly dedicated to shielding performers from the misuse of their voices and likenesses.”
Sora 2 allows users to generate “historical figures,” broadly defined as well-known individuals who are deceased. However, OpenAI recently acknowledged that representatives of “recently deceased” celebrities can request that their likeness be blocked from Sora 2.
Earlier this month, OpenAI announced it had partnered with the estate of Martin Luther King Jr to block depictions of King in Sora 2 at the estate’s request as it “strengthened guardrails around historical figures.”
Recently, Zelda Williams, the daughter of the late actor Robin Williams, pleaded with people to “stop” sending her AI videos of her father, while Kelly Carlin, the daughter of the late comedian George Carlin, characterized her father’s AI videos as “overwhelming and depressing.”
Legal experts have speculated that generative AI platforms may be using deceased historical figures to test what is legally permissible.
Source: www.theguardian.com
