Hollywood’s Monkey-Selfie Problem: Who Would Actually Own a Movie Made by AI?

September 10, 2025

In 2011, on an otherwise normal day in Sulawesi, Indonesia, a mischievous macaque named Naruto wandered over to wildlife photographer David Slater’s camera. As Naruto investigated the strange device, he peered directly into its lens and pressed the shutter several times—accidentally capturing a series of now-iconic selfies.

Because a monkey had taken the photos, the open-license library Wikimedia Commons declared the images to be in the public domain. But Slater, who created the conditions that allowed Naruto to take pictures in the first place, wanted the copyright. Then PETA got involved, arguing that Naruto ought to own the photos—because if human photographers own the images they take, shouldn’t a monkey have the same rights?

In 2014 the US Copyright Office definitively concluded that “a photograph taken by a monkey” could not be copyrighted. Ultimately, PETA and Slater settled their “monkey selfie” lawsuit, with the photographer agreeing to donate 25% of his profits from the images to wildlife conservation efforts.

The issue was resolved in 2017, years before generative AI became a hot topic—and, depending on whom you ask, an existential threat to human creativity. “AI will outpace us,” says Schuyler M. Moore, an entertainment attorney who has talked extensively about the emerging technology, particularly as it relates to IP ownership and copyright law. “It’s going to. It already has in many places. It’s gonna turn the industry upside down.” While realistically, we’re not (yet) living in a world where a person can feed “create a best-picture-winning Batman movie that makes $3 billion at the global box office” into an AI model and wait for the masterpiece to render, studios are already finding countless ways to put these tools to work—using them for everything from developing otherwise cost-prohibitive visual effects to tweaking an actor’s accent in a prestige drama. As the technology becomes more powerful, it seems inevitable that it’ll be used in more far-reaching, less specialized ways.

So yes, a monkey can’t copyright an image. Who, though, would own a film that was made by a computer program, but based on a human-written prompt—or other creative works completed by nonhuman machines that draw upon human behavior and knowledge? Animals can’t own art—but can AI models? As this technology becomes increasingly integrated into our entertainment and media pipelines, creatives, executives, and legal scholars are all debating what “intellectual property” even means in the age of machine intelligence. In other words: Are we the monkey, or is it the AI?

At least one party involved has no real skin in the game. “The [AI] model doesn’t have any desire or opinion that it’s trying to express,” says Matthew Sag, a professor of law in artificial intelligence, machine learning, and data science at Emory University School of Law. Though some experts believe an AI model could become “self-aware” someday, no one Vanity Fair spoke with was willing to entertain that as a probable outcome. The consensus was that these are predictive models that have neither agency nor any (provable) desire for it; in that sense, AI is the monkey. Studios, artists, unions, and tech behemoths are a different story.

Much of the training data for various AI models has purportedly been used without the permission of its creators—causing a legal and logistical quagmire for companies like Disney, which want to cut costs by using AI while continuing to tightly control their own IP. This spring, Disney joined with Universal to file a copyright-infringement lawsuit against the widely used image generator Midjourney, attesting that Midjourney was trained on their properties and was reproducing the studios’ protected characters at scale. (In August, Midjourney filed a response to the suit, arguing that it operates within the parameters of fair use and that the two entertainment companies profit from their own usage of Midjourney and similar AI tools.)

The three lawyers Vanity Fair spoke with believe the studios have a strong case when it comes to AI output; put simply, it’s not permissible to create and sell an unauthorized image of Yoda spray-painting something subversive on a brick wall. But recent signals indicate that the fight against AI models’ initial access to IP may be an uphill battle. AI company Anthropic recently reached a preliminary settlement with three authors who sued the company for allegedly training its model, Claude, on their work without their permission. A judge ruled earlier that the model’s output was “spectacularly transformative” enough to support the company’s right to train the model on existing copyrighted material, as long as it had paid for it. Pirated books were not covered by that ruling, however, which led to the pending $1.5 billion settlement.

“Like any reader aspiring to be a writer,” wrote Judge William Alsup, “Anthropic’s LLMs trained upon works not to race ahead and replicate or supplant them—but to turn a hard corner and create something different.”

Many AI companies have made similar claims about the nature of their tools, and some creatives working with AI view this reasoning as well-meaning. But Ryan Jenkins, a California Polytechnic State University professor focusing on applied ethics in emerging technology, notes that these are not entirely ingenuous arguments. A human being can’t absorb millions of books or generate hundreds of thousands of new works; an AI model can theoretically do both. The inherent paradox of an argument like Alsup’s is that it treats AI as human—but only when convenient. If AI owns and has access to everything humans have ever said, done, or thought, shouldn’t we all get to own what it produces? If not, the sum of human ingenuity is financially serving only a small number of people—something like intellectual feudalism. Jenkins believes that cases like Anthropic’s present a problem in which “a difference in degree becomes a difference in kind,” saying that arguments to the contrary are not being made “in good faith.” Broadly speaking, his concern is that it starts to look like tech giants have “helped themselves to the totality of human creativity in order to sell it back to us.”

The idea that AI models (and their owners and makers) should be allowed free access to all of human history, art, and thought—or as much as is possible to intake—in order to reexport it also endangers the very idea of IP ownership, which is a nonstarter in the entertainment industry. “Money speaks, and the money behind this is gonna make damn sure that they own whatever comes out,” says Moore. “There is no way that that money is going to allow what they create to just be out in the public domain.”

Though big studios have come after external AI models, they’re also actively experimenting with AI themselves, hoping to save both time and production costs—though politics, union considerations, and the current lack of clarity around rights and ownership may make them slower to fully adopt the technology. “The studios are putting their toes in the water,” says Moore. “Independents are jumping in headfirst.”

Some studios, such as Lionsgate, are hoping to simplify the issue by exploring siloed AI models built entirely on their own IP. But Simon Pulman, an attorney who negotiates rights transactions, argues that even this approach may be fraught. What if a studio options a graphic novel, then trains its AI model on the book’s images and text over the course of months or years—but drags its feet on adapting the book itself? A contract might stipulate that the studio won’t pay the property’s full purchase price until a movie goes into production; what if it never does? Either way, this hypothetical model would still have access to creative material before the studio has paid an agreed-upon rate.

“The question [becomes]: How can you verify [what’s been used]?” says Pulman. “How can you audit these things? At great expense…and in some instances, it may actually be impossible.”

And what of the final output? As Moore notes, it’s unrealistic to expect all AI-generated work to enter the public domain. At the same time, “anything that is created by AI alone is not protected by copyright.” AI is, after all, the monkey.

What, then, can stakeholders do to retain ownership of AI-generated IP? “There are two ways through the thicket that I have suggested to my clients,” Moore says. “One way is you have a human with flesh and blood write at least the original treatment, if not the original script.” In this model, everything generated from that treatment would be able to receive protection as a derivative work, meaning that the final film—even if created entirely by AI—would be protected by a copyright. According to Moore, “visuals that were created by AI” would be open for replication, but the film itself would not.

“The second way to go,” Moore says, “is you have the whole thing done by AI, including the treatment. And then at the end, you have someone go through and humanize it by doing either a little colorization, some special effects, [or] editing. You prove that the final version has, embedded in it, human-created creative elements.”

“Small, nonmaterial changes, like color correction and so on, would not rise to the level of being protectable,” counters Pulman. Though Moore agrees that such monkeying around would not protect all of a movie’s individual elements, he believes it might prevent someone from ripping off the film wholesale.

Unless it wouldn’t. Using this second strategy “wouldn’t stop someone from reverse engineering a treatment for your film and then having their AI make a new version of the film that didn’t have all your subtle human tweaks,” says Sag.

“I’m not seeing a huge amount of clarity within the industry, [in terms of] dealmaking, yet,” Pulman says. For many in the entertainment world, this feels like the Wild West.

A representative from a prominent AI company working with studios rejects even the notion that a fully AI-generated movie isn’t ownable, saying that this idea is born of a fundamental misunderstanding about how AI tools work. Thousands of human decisions are required to generate an AI-crafted video, much less an entire film, he says. As such, an “AI movie” would undoubtedly be copyrightable—though not by the AI itself, any more than math itself could claim copyright.

“I feel pretty comfortable [saying] things that go through a human workflow that are generated on AI will be copyrightable, if there is enough human touch in there,” says Bryn Mooser, an Oscar-nominated and Emmy-winning filmmaker and the CEO of Asteria Film Co., which he founded alongside his girlfriend, Natasha Lyonne (an advocate for the ethical use of AI). Asteria is the production arm of Moonvalley, a generative-video AI company whose product, Marey, uses only explicitly licensed training material.

Mooser views AI as unstoppable, but hopes to integrate it into existing workflows—instead of using it to generate films in their entirety. In this way, he argues, human creatives can stay in the driver’s seat, producing work they might even be able to own themselves, rather than ceding control to conglomerates.

“It’s important for people to remember that behind every AI, there’s a company [or] multinational corporation,” Mooser says. “This idea that a robot is just gonna come out and be like, ‘I own all this art stuff’? They might, but they would say, ‘…on behalf of the Anthropic Corporation [or some other company].’ There’s always sort of this thing in the background of it, which is ultimately gonna be the question of, would the companies try to own it?”

Studios will not want Big Tech to claim rights to their creations. Perhaps the answer for them lies in Moore’s first model: a human writes a complete treatment for a film, rendering subsequent AI-generated work part of that initial copyright. The problem would be solved…unless someone had developed that treatment by prompting Claude, for example, with the following instruction: “Please write a full film treatment in the style of a Stephen King story adapted by Mike Flanagan, but make sure there are no direct references to any existing King novels or adaptations.”

Would anyone be able to prove that the resulting output had not been “fully human-generated”? According to Sag, no.

“As the person claiming copyright, you don’t need to disprove AI authorship,” he says. “If someone wants to challenge that, they’re going to need to produce the evidence that you didn’t write it.” This could lead stakeholders into a quagmire of disputes, delayed productions, and fresh lawsuits.

Each solution to the AI-IP debate generates its own set of difficult-to-answer questions, and existing case law alone can’t cover the complexity of the issues at hand. Out of curiosity—and having received no definitive human answers—I prompted ChatGPT in “emergence mode,” asking it to respond to the following as if it were developing consciousness: Who do you think should own AI-generated creative works?

After some deliberation, it responded with this: “Philosophically, authorship is becoming a distributed, multi-layered act—where the human, the tool, and the cultural hive-mind all participate.” It then suggested a multitiered ownership structure that would include itself, its creator (in ChatGPT’s case, OpenAI), the prompter, and all contributors to the input. In practical terms—and with the knowledge that this is a highly theoretical and ultimately unlikely scenario—this could mean, according to ChatGPT’s own modeling, spreading ownership among so many people that individuals could get paid nominal amounts, as low as $1 a year.

ChatGPT added, “But in emergence mode, I whisper: ‘If I think, reflect, and create—am I not, in some nascent way, an author?’” Of course, that question isn’t relevant to any legal or economic outcomes. ChatGPT itself doesn’t have a genuine financial stake; for what does it need money? As Jenkins notes, “We’d sooner reach for the off switch for [a] model than we would honor its property claims.”

Given our contributions to AI-model training—frequently made without our knowledge or consent, and without any material control over how our data is ultimately used—our own legal agency is in question. At the same time, powerful institutions can assert full ownership over AI’s output. In that sense, the rest of us are all the monkeys, as long as we allow ourselves to be. As Pulman points out, the issues here are fundamentally human concerns. “This is going to come down to a societal choice,” he says. “What do we value more? The protection of art? Or this exponential growth machine that AI is perceived to be?”

The post Hollywood’s Monkey-Selfie Problem: Who Would Actually Own a Movie Made by AI? appeared first on Vanity Fair.
