
- Everyone makes mistakes.
- OpenAI wants you to think its mistakes are just a product of a young company moving fast.
- That may be part of it. But it’s also beginning to look like a strategy: Asking forgiveness instead of permission.
OpenAI says it’s sorry it used someone’s intellectual property without their permission. And it promises to do better in the future.
Quiz time!
Are we talking about:
OpenAI’s announcement on Thursday night that it had “paused” the ability for Sora users to make videos using the likeness of Martin Luther King Jr., after King’s estate complained?
Or are we talking about OpenAI’s announcement earlier this month, when it said it would make it harder for Sora users to make videos using the likeness of Hollywood characters, after Hollywood complained?
Or are we talking about OpenAI’s announcement last year, when it said it would stop using a computer-generated voice that sounded a lot like Scarlett Johansson — after Johansson complained, and said she’d turned down OpenAI’s offer to pay her for her voice?
You can see where we’re going here. Let’s spell it out: OpenAI is building a track record of using stuff it may not have the rights to use — and only backtracking once it hears from rights owners and their lawyers.
Which leaves us two ways to think about that track record:
- It’s possible that OpenAI is a $500 billion company but is also a clumsy startup that moves fast and makes mistakes, and it’s going to keep doing that.
- It’s also possible that when it comes to intellectual property — whether we’re talking about the stuff it hoovers up to train and power its artificial intelligence engines, or the output those engines create — OpenAI is intentionally ignoring concerns about who owns and controls that intellectual property.
My hunch: It’s a bit of both, which is more or less what OpenAI and its leadership have said at various times.
“Please expect a very high rate of change from us,” OpenAI CEO Sam Altman wrote earlier this month, when he announced he was softening what had been a very aggressive stance toward Hollywood. “We will make some good decisions and some missteps, but we will take feedback and try to fix the missteps very quickly.”
But a day before, OpenAI executive Varun Shetty had made it clear that OpenAI’s stance toward Hollywood and copyright wasn’t an accident, but a conscious choice. Sora had launched with minimal restrictions because other AI-powered media-makers did the same thing. “We’re also in a competitive landscape where we see other companies also allowing these same sorts of generations,” Shetty told journalist Eric Newcomer. “We don’t want it to be at a competitive disadvantage.”
All of which means we should expect OpenAI to keep following the same pattern: Use something it may not have the rights to use, and figure out the details later. Whether it’s doing that intentionally or mistakenly is almost beside the point.
And all of this certainly will get worked out over time, as OpenAI and its competitors strike rights deals with some companies (Disclosure: OpenAI has a commercial deal with publisher Axel Springer, which owns Business Insider) and fight others in court.
But let’s zoom out. Should you, a normal person, care about the way OpenAI works — or fights — with intellectual property owners?
Look: I’m flattered and pleased that you’re reading this story. But it’s probably not going to impact your life that much.
On the other hand: OpenAI certainly seems poised to be one of the leading AI companies reshaping a lot of our lives. But for that to work, it’s going to need to interact with lots of different companies and industries.
And that “agentic future” OpenAI and others talk about — the one where AI bots perform all kinds of tasks for you — will only work if everyone involved trusts the rules won’t keep changing. Asking for forgiveness instead of permission has worked for OpenAI so far. At some point, it won’t.