You’re probably familiar with the feeling that comes from seeing the latest technological advancement and wondering why. Why did somebody make this?
I’ve experienced plenty of these moments in 25 years reporting on technology, but I’ve also learned that the disenchantments caused by banal “innovations” can be worth enduring in exchange for breakthroughs and genuine glimpses of the future.
So I didn’t know what to expect when I opened Sora, OpenAI’s new app that delivers a never-ending feed of brief videos generated by artificial intelligence. It looks a lot like TikTok or Instagram’s Reels, except every single pixel, every sound is fake — generated by users typing instructions to Sora’s A.I.
Sam Altman, OpenAI’s leader, had proclaimed it “the most powerful imagination engine ever built.” The truth is that using it made me want to run, screaming, into the ocean.
At a time when we are surrounded by fakes and fabrications, Sora seems precisely designed to further erode the idea of objective truth. It is a jackhammer that demolishes the barrier between the real and the unreal. No new product has ever left me feeling so pessimistic.
My experience digging through Sora’s library of hallucinations started out goofy: a man in a car full of hot dogs; a humanoid capybara playing soccer.
The absurd videos rapidly became stranger and more disturbing. Security camera-style footage of a cat being blown off a porch into the eye of a tornado was followed by an almost-identical remix of a cow being blown off the exact same porch into the exact same storm. I saw realistic-looking footage of famous people saying words that had never crossed their lips — Martin Luther King Jr. declaring, “I have a dream that they release the Epstein files”; Tupac Shakur claiming to be living in Cuba.
Sora is a ghoulish puppet show, and exploring it feels like wandering around an empty funfair. Even when there is a semblance of genuine human interaction, it manages to be uncanny and disquieting: You can “collaborate” with other people who have uploaded their own likenesses into the system, but this really means your fake avatar collaborating with their fake avatar.
Frauds, counterfeits and fabrications have been upsetting and infuriating people since the beginning of history, from Roman sculptures that are knockoffs of Greek originals, to unauthorized copies of books in the 19th century, to wartime propaganda, to pirated music on Napster.
Today, though, we are in danger of being buried under an avalanche of fakes so deep that it creates an entire alternative reality. Systems like OpenAI’s ChatGPT — which ingest billions of words and spew them back out in intelligent-sounding ways — have led to an information ecosystem increasingly filled with A.I. “slop,” the dull, low-effort and often factually incorrect content that pollutes search engines and drowns out genuine creative work.
Sora’s hollow unreality goes far beyond that. Making it trivially easy to create realistic video pushes the existential dangers of deep fakes and misinformation to a new level. It’s not hard to imagine faked footage of a political protest or act of violence leading to bad outcomes. Propagandists and authoritarians can undermine reality or uplift the unreal to serve their ideological goals simply by typing a few words into a box.
To combat this, Sora videos carry a visible — but small — label to show that they were created with OpenAI’s tools, as well as an invisible fingerprint that can help trace them “with high accuracy” back to their source. The company says it has “guardrails” to block people from creating offensive material and that users have control over their likeness. (Tell that to the dead, who were being resurrected wholesale until OpenAI added more prohibitions.)
Enterprising trolls are already finding ways around technical protections, however. In my first few days on the app, videos of Hitler were common — or at least approximations of Hitler created by prompts like “show Charlie Chaplin in a German military uniform.” There are already tools that can remove the Sora label, and the app’s videos are starting to proliferate on other platforms without attribution.
History shows that there are other potential responses to an onslaught of counterfeits: Roman copies of Greek statues came to be largely accepted, knockoffs though they were; Mark Twain and Charles Dickens responded to book piracy by pushing for what eventually became our modern copyright laws; and after Steve Jobs convinced the music labels that Apple was less of a threat than Napster, legal digital music became as easy to find as the pirated stuff.
None of society’s earlier methods to combat counterfeits and copycats seem to be enough right now, though. Visit any beleaguered teacher and you’ll see that media literacy is a struggle in the face of such rapid change. The market is in thrall to the hazy promise of future profits generated through the help of A.I. tools, and many governments have shown more willingness to court A.I. companies and their billions of dollars than to curtail their worst excesses.
The British theorist Stafford Beer once said that “the purpose of a system is what it does” — a helpful reminder to judge a process not by its stated mission but by its outcomes. Sora’s purpose, whether intentional or not, seems to be to make real the danger that activists have warned of: overwhelming users with so much fake content that they no longer have a choice but to assume that everything is false.
One thing a career following technology has taught me, though, is that change may be constant, but specific technologies aren’t inevitable. History is littered with hyped products that fell flat and developments that were quickly discarded once the true costs became visible. We can see the damage A.I. can cause. How willing are we to stand up to its perversion of reality?
The first steps may be small. As I scrolled through the app’s feed, a user survey popped up. “How did using Sora impact your mood?” it said, text on a black screen, with a thumbs down or a thumbs up.
I responded the only way that seemed sensible, and deleted the app from my phone.
Bobbie Johnson is a technology and science journalist who is working on a book about the history of fakes and the battle for authenticity.
The post What Is Sora Slop For, Exactly? appeared first on New York Times.