CMU School of Drama


Friday, October 31, 2025

OpenAI Works With SAG-AFTRA, Agencies on Sora 2 Guardrails After Pressure From Bryan Cranston

TheWrap: After alarm bells rang across Hollywood over misuse of OpenAI’s new video generation software Sora 2, the artificial intelligence company has agreed to work with SAG-AFTRA and top talent agencies to establish guardrails against unauthorized voice and likeness replications.

4 comments:

SapphireSkies said...

I'm hopeful seeing that the protections against AI that SAG-AFTRA fought for are being put to use. However, I think it will be really interesting and telling just how well OpenAI is able to keep to this promise and the contract they've signed. It's notable that they already broke a promise, and are only now addressing something that was available at launch because someone complained. It makes you think they were always planning on circumventing the rules unless someone called them out on it. It almost feels predatory, because you can imagine them doing the same thing to a much smaller artist who may not have enough sway to move the union, or to move people to support them in fighting back. For example, an actor who's just starting out and has been in one hit may be a recognized enough name for people to try to create things with their likeness, but not recognized enough to have people willing to come to their defense the way they would for Bryan Cranston.

CaspianComments said...

I’m glad to see Bryan Cranston speak up about this issue and collaborate with SAG-AFTRA to push for guardrails. This is a step in the right direction toward making sure that AI is properly limited and creatives don’t lose their jobs. I hope we will continue to see this trend and that the companies behind these AI tools actually uphold their end of the deal and start limiting them. Unfortunately, I can’t help but think that they might not, and that further issues may follow. If that happens to be the case, I hope that SAG-AFTRA and other unions representing creatives fight them in court and push for legislation against AI misuse. Of course, with the current administration in America, this may very well prove difficult or maybe even impossible. However, I believe that even then they should push for it, because eventually (hopefully) the morons will be kicked out of office and there will be people who understand the need for guardrails on AI.

Aiden Rasmussen said...

I really appreciate these groups' efforts to protect performers and artists from the misuse of AI. I wish that it didn’t always take a misstep and then some backlash for something to change, but if that’s the only way it’ll happen, so be it. I’m more and more hopeful as I hear people talk about AI that it won’t be used to replace artists. I think collaboration between these generative AI companies and production companies and unions is necessary to protect everyone involved in making this art. It’s very frustrating that so many of these AI groups don’t realize how important humans are to the concept of art, but if artists keep fighting for each other, it shouldn’t matter. I really appreciate that Bryan Cranston is getting himself involved in the effort, and hope that more big voices speak out as well. I’m really hopeful in general that we keep fighting to protect each other as AI grows.

Rachel N said...

THANK. GOD! And with that optimism out of the way, my actual thoughts on this are hesitant, to say the least. When I first heard about Sora 2, the potential for misuse and the extremely devastating stories that could come out of it were the first things that came to mind. An AI video-generation platform may seem cool in concept, and clearly many people think so given the popularity of the application since its release, but the consequences are going to be detrimental to everyone, even those who aren’t involved. The fact that Sora is so easily accessible, and that its protections against impersonating celebrities can be so easily bypassed, is absolutely horrifying. Though this article clarifies how Sora’s developers can ensure guardrails are implemented, the fact that realistic impersonation of others can be accomplished by anyone using the platform makes it dangerous in principle. I’m seriously unsure of the commitment and integrity of Sora’s developers when it comes to implementing these guardrails, and though this is in theory good news, woefully we’ll just have to see how it plays out.