UPDATED with SAG-AFTRA statement: SAG-AFTRA has followed UTA, CAA and the MPA in sounding the alarm over Sora 2, the newest version of OpenAI's video-generating app.
SAG-AFTRA President Sean Astin and National Executive Director & Chief Negotiator Duncan Crabtree-Ireland insisted in a joint statement on Thursday that art is about connection and performance, not simulation.
“The world must be reminded that what moves us isn’t synthetic. It’s human,” they wrote.
The duo called out the tech companies and the media for creating “a sensationalized narrative, designed to manipulate the public and make space for continued exploitation.”
Specifically, with the focus on Tilly Norwood, they say, news outlets anthropomorphize a batch of code and “tease the story of a more realistic-looking artificial creation as an entertainment industry ‘breakthrough’ or breathlessly stoke a non-existent ‘star signing’ competition among agencies.”
Politicians come in for some blame for not regulating Artificial Intelligence and protecting creators.
“Tilly is not the threat, the real danger comes from an unregulated environment that can only flourish by stealing digital information from artists and companies and using it without ethics or respect. This story of creating synthetic characters is not about novelty. It’s about authorship.”
Some argue that, with AI, the author is the human delivering prompts to the software. Astin and Crabtree-Ireland acknowledge those human inputs, but call such a process an "insult" to artistry.
“Yes, there is human effort in assembling synthetic imagery or voices like Tilly Norwood,” they write. “But that process undermines the very ecosystem that makes storytelling possible. It insults the artistry of our performers, assaults our business, and threatens the legacy our members’ work creates, in many cases built over generations.”
See their joint statement in full at the bottom of this post.
"There is no substitute for human talent in our business, and we will continue to fight tirelessly for our clients to ensure that they are protected. When it comes to OpenAI's Sora or any other platform that seeks to profit from our clients' intellectual property and likeness, we stand with artists," UTA said.
“The future of industries based on creative expression and artistry relies on controls, protections, and rightful compensation. The use of such property without consent, credit or compensation is exploitation, not innovation.”
Sam Altman's OpenAI, parent of ChatGPT, just unveiled Sora 2, the app's latest and, for Hollywood, most threatening version. Yesterday, CAA weighed in, saying it "is unwavering in our commitment to protect our clients and the integrity of their creations. The misuse of new technologies carries consequences that reach far beyond entertainment and media, posing serious and harmful risks to individuals, businesses, and societies globally. It is clear that OpenAI/Sora exposes our clients and their intellectual property to significant risk."
And earlier this week, MPA chief Charles Rivkin blasted the app, noting that "Since Sora 2's release, videos that infringe our members' films, shows, and characters have proliferated on OpenAI's service and across social media."
The Sora generative video model allows users to create social-media-ready videos with just a brief text prompt. The result can be a product of the user’s imagination, or a fan-fiction-like story using recognizable properties.
In response to the furor from content creators over the unauthorized use of their works, OpenAI’s Sam Altman walked back the company’s initial approach of having IP owners opt out of having their stuff fed into the model. He wrote last week that “we will give rightsholders more granular control over generation of characters, similar to the opt-in model for likeness but with additional controls.”
He also held out the possibility of remuneration down the line. But it’s still vague. “We are going to have to somehow make money for video generation … We are going to try sharing some of this revenue with rightsholders who want their characters generated by users. The exact model will take some trial and error to figure out, but we plan to start very soon. Our hope is that the new kind of engagement is even more valuable than the revenue share, but of course we want both to be valuable.”
Hollywood is skeptical. While studios haven't sued OpenAI yet, they have been testing the waters against smaller companies, from San Francisco-based Midjourney to Chinese AI firm MiniMax, with lawsuits and various cease-and-desist letters.
Here is the statement from SAG-AFTRA’s Astin and Crabtree-Ireland:
A.I. developments are in the headlines. Here’s what SAG-AFTRA is doing.
Last week, dramatic headlines and a crescendo of news stories broke regarding an A.I.-generated synthetic character called “Tilly Norwood” and the release of Sora 2 by OpenAI. These developments reignited debate about how artificial intelligence is reshaping the film and television industry. The confluence of these stories has broken through the relentless stream of news and information on this subject and focused our collective curiosity and anxiety about the future of creativity itself at this moment. It’s an opportune time to demonstrate how our union is fighting to protect our interests and speak in defense of what it means to be a performer.
Let’s be clear: Tilly Norwood is not a person. It’s a synthetic construct generated by software trained on the work of countless professional performers, real human beings, whose work was taken without permission, without credit and without compensation. When news outlets tease the story of a more realistic-looking artificial creation as an entertainment industry “breakthrough” or breathlessly stoke a non-existent “star signing” competition among agencies, it misses the fundamental truth: Tilly is not the threat, the real danger comes from an unregulated environment that can only flourish by stealing digital information from artists and companies and using it without ethics or respect. This story of creating synthetic characters is not about novelty. It’s about authorship, consent and the value of human artistry. Anthropomorphizing the synthetic fake, giving it a memorable name and playing up the character’s representation of beauty is not objective or authentic; it is a distraction, a misdirection from what is actually taking place. What has been created is a sensationalized narrative, designed to manipulate the public and make space for continued exploitation.
The public release of Sora 2 and its remarkably advanced capabilities excited some observers. For many more of us, this lightning-fast technological evolution brings profound concern. OpenAI’s decision to honor copyright only through an “opt-out” model threatens the economic foundation of our entire industry and underscores the stakes in the litigation currently working through the courts. If A.I. companies can shift the burden to rightsholders to opt out, what does copyright really mean? Opt-out isn’t consent — let alone informed consent. That’s why SAG-AFTRA fights for opt-in approaches. No one’s creative work, image, likeness or voice should be used without affirmative, informed consent. Anything less is an unjustifiable violation of our rights.
That said, Sora 2’s approach to image, likeness and voice replication through its “cameo” function deserves recognition. This feature allows you to create a digital replica of yourself and control its use within Sora 2, including whether others can access it. Critically, this approach is opt-in, which makes all the difference. While the controls and details remain imperfect, they incorporate core principles of informed consent and implement them systematically. This reflects months of dialogue between SAG-AFTRA and OpenAI, along with the dedicated work of our A.I. Taskforce and staff. We hope more A.I. companies will follow OpenAI’s lead in this respect.
But fundamentally, we must remember and remind everyone that audiences don’t build emotional connections or lifelong relationships with algorithms. They connect with artists. They see themselves reflected in real human performances, in the joy, heartbreak, jealousy, love, resilience and truth that only a person can express. From Homer and Shakespeare to today’s storytellers, performance has always been a mirror of our shared humanity. No dataset or generative model can capture that spark. A.I. is getting more realistic looking, but audiences will always gravitate to that which is real and true. They want to know that the artist really feels what they are feeling, in a laugh, a sob, a smile or a tear. When they are moved, they want to know that what they are feeling is real. They want proof. The intent of the artist is what makes it real. If you scrape, feed or otherwise deconstruct the love or anger into a trillion component parts and then reassemble them to be manipulated and repurposed without telling the audience exactly where it came from, you are cheapening their experience and unmooring their reality.
Yes, there is human effort in assembling synthetic imagery or voices like Tilly Norwood. But that process undermines the very ecosystem that makes storytelling possible. It insults the artistry of our performers, assaults our business, and threatens the legacy our members’ work creates, in many cases built over generations.
That’s why SAG-AFTRA has fought and will continue to fight for strong, enforceable protections. Throughout labor history, when new technologies emerge and are adopted by business, workers are disadvantaged and unions must fight to protect them.
In 2017, your union leadership formally identified this threat and began working passionately, creatively and unceasingly to combat it. Many of our members don’t know that we have partnered with policymakers, in some cases even drafting A.I. protection legislation. We have been navigating the complexities of intellectual property law and more, because the power of our contracts must be amplified by the law, and so far, that law barely exists.
The protections your union has already secured:
In the 2023 strike, SAG-AFTRA won its first-ever, enforceable protections around artificial intelligence. Employers must obtain clear, informed consent before creating or using a digital replica. They must pay fairly for that use. They cannot reuse a scan or a performance indefinitely without new bargaining and compensation. And they cannot deploy synthetic performers in covered film, television and streaming projects, except under strict conditions.
Our contracts across film, television, commercials and other areas generally require notice and bargaining when synthetic performers are going to be used.
If a synthetic character is created by prompting a generative A.I. system with a performer’s name (with the addition, in some cases, of a major facial feature), the performer’s consent is required.
What the law does not yet protect:
Our contracts bind only signatory employers. They can’t stop A.I. developers from scraping performances off the internet or from training models on decades of film and television without permission. That is why SAG-AFTRA has been leading the fight for stronger laws:
The No FAKES Act would prohibit unauthorized digital replicas.
The TRAIN Act would require transparency in training datasets.
The A.I. Accountability & Data Protection Act would create a federal tort for unauthorized use of biometric data.
And in California and New York, we’ve championed legislation requiring transparency and disclosure when synthetic performers are used.
Where we go from here:
We know that some companies will continue to push the limits, marketing “synthetic performers” as the next big thing. And while tools like Sora 2 generate awe for their technical prowess, audiences continue to show that what moves them most is not simulation, it’s sincerity. The real connection happens only when a living performer brings a story to life.
Our commitment is simple and our position is unwavering:
Performance must remain human-centered.
A.I. can enhance creativity, but it must never replace it.
A.I. use must be transparent, consensual, and compensated.
We invite you to learn more about your protections and our ongoing advocacy at sagaftra.org/ai.
This moment is noisy, but it’s also clarifying. The world must be reminded that what moves us isn’t synthetic. It’s human. And as long as SAG-AFTRA exists, that humanity will be defended.
With respect and resolve,
Sean Astin
President
Duncan Crabtree-Ireland
National Executive Director & Chief Negotiator
The post SAG-AFTRA’s Sean Astin Comes Out Swinging Against Sora, Politicians, The Media & AI Anthropomorphism appeared first on Deadline.