
The Breaking Bad actor never signed up to be featured on OpenAI’s video-sharing platform Sora, yet videos of him certainly surfaced.


Actors, studios, agents, and the SAG-AFTRA actors union have raised concerns about appearing in Sora 2's AI-generated videos since the deepfake technology launched last month. Now a joint statement from actor Bryan Cranston, OpenAI, the union, and others indicates that after videos of Cranston surfaced on Sora, including one showing him taking a selfie with Michael Jackson, the company has "enhanced safeguards" around its opt-in policy for likeness and voice.
In the joint statement, OpenAI said it "regretted these unintended creations." The statement also included endorsements from talent agencies such as the United Talent Agency, the Association of Talent Agents, and the Creative Artists Agency, which had previously criticized the company for insufficient protections for artists. OpenAI did not detail how it intends to modify the app, and it did not respond to The Verge's request for comment before publication.
OpenAI appeared to reaffirm its commitment to stronger safeguards for those who opt out: "All artists, performers, and individuals will have the authority to decide how and whether they can be replicated." It also said it would "promptly" address complaints regarding policy violations.
Cranston expressed his appreciation to OpenAI for its policy and for enhancing its safeguards. While Cranston’s situation reached a satisfactory outcome, SAG-AFTRA president Sean Astin indicated in the joint statement that performers require legislative protections from “extensive misappropriation via replication technology,” referring to the proposed Nurture Originals, Foster Art, and Keep Entertainment Safe Act, or NO FAKES Act.
OpenAI launched Sora 2 with a policy that let copyright holders opt out, before reversing course following public backlash and videos of Nazi SpongeBob, pledging to "provide rightsholders with more granular authority over character generation, similar to the opt-in framework for likeness but with additional controls."