OpenAI wowed the tech community and many in media and the arts earlier this year — while also ruffling the feathers of traditional videographers and artists — by showing off a new AI model called Sora that generates realistic, high-resolution, smooth video in clips of up to 60 seconds.
The tech remains unreleased to the public for now — OpenAI said at the time, back in February 2024, that it was making Sora “available to red teamers to assess critical areas for harms or risks” and to a small, select group of “visual artists, designers, and filmmakers.” But that hasn’t stopped some of this initial wave of users from making and publishing new projects with it.
Now, one of OpenAI’s handpicked Sora early access users, writer/director Paul Trillo, who was among the first in the world in March to demo third-party videos made with the model, has created what is being called the “first official music video made with OpenAI’s Sora.”
The video was made for indie chillwave musician Washed Out (Ernest Weatherly Greene Jr.) and his new single “The Hardest Part.” It is essentially a four-minute series of connected, quick zoom shots through different scenes, all stitched together to create the illusion of one continuous zoom. Watch it below:
On his account on the social network X, Trillo posted that he’d first had the idea for the video 10 years ago but abandoned it. Replying to questions from his followers, he said the video was assembled from 55 separate clips generated by Sora, selected from a pool of 700 total, and stitched together in Adobe Premiere.
Separately but relatedly, Adobe recently announced that it is looking to add Sora and other third-party AI video generation models to its subscription Premiere Pro software. No timeline has been set for that integration, so in the meantime, those looking to emulate Trillo’s workflow would have to generate AI video clips with other third-party tools such as Runway or Pika (since Sora remains non-public) and then save and import them into Premiere. Not the end of the world, but not as seamless as it could be.
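For readers curious about the mechanical side of that workflow, here is a minimal, hypothetical sketch of the “generate clips elsewhere, then stitch them” step, using ffmpeg’s concat demuxer in place of Premiere. The folder name, clip naming, and output filename are assumptions, and this is not Trillo’s actual Premiere-based edit, which relied on hand-built transitions to sell the continuous-zoom illusion.

```python
# Hypothetical sketch: concatenate pre-generated AI video clips into one file.
# Assumes ffmpeg is installed and all clips share codec, resolution, and frame rate.
import subprocess
from pathlib import Path

CLIPS_DIR = Path("clips")               # assumed folder, e.g. clip_001.mp4 ... clip_055.mp4
OUTPUT = "hardest_part_rough_cut.mp4"   # assumed output name

def stitch_clips(clips_dir: Path, output: str) -> None:
    clips = sorted(clips_dir.glob("*.mp4"))
    if not clips:
        raise FileNotFoundError(f"No .mp4 clips found in {clips_dir}")

    # ffmpeg's concat demuxer reads a text file listing the inputs in order.
    list_file = clips_dir / "concat_list.txt"
    list_file.write_text("\n".join(f"file '{c.resolve()}'" for c in clips))

    subprocess.run(
        [
            "ffmpeg",
            "-f", "concat",
            "-safe", "0",       # allow absolute paths in the list file
            "-i", str(list_file),
            "-c", "copy",       # no re-encode; clips must share parameters
            output,
        ],
        check=True,
    )

if __name__ == "__main__":
    stitch_clips(CLIPS_DIR, OUTPUT)
```

This only covers the rough assembly of clips in order; the selection of 55 keepers from roughly 700 generations and the timing of the hidden cuts are exactly the kind of editorial work the article describes happening in Premiere.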
In an interview with the Los Angeles Times, Washed Out/Greene said, “I look forward to being able to incorporate some of this brand-new technology and seeing how that informs what I can come up with. So, if that’s pioneering, I would love to be part of that.” The interview also goes over some of the specific prompts used:
“Trillo needed to write prompts with enough specific details about not just the image itself but the shot angles and movements of the characters. ‘We zoom through the bubble it pops and we zoom through the bubblegum and enter an open football field,’ Trillo wrote as part of his prompt for one brief snippet of video. ‘The scene is moving rapidly, showing a front perspective, showing the students getting bigger and faster.’”
Trillo also posted that he used only the model’s text-to-video capabilities, rather than taking still images captured or generated elsewhere and feeding them into the AI to add motion (a popular tactic among artists in the quickly evolving AI video scene).
This example shows the power of Sora for creating media with AI, and it serves as a helpful rejoinder to the recent revelation that another of the first demo videos, “Air Head” by Canadian creative studio Shy Kids, featuring a man with a balloon for a head, actually leaned heavily on conventional VFX and video-editing work such as rotoscoping in Adobe After Effects.
It also shows the continued appetite among some creatives — in music and video — to use new AI tools to express themselves and tell stories, even as many other creatives criticize the technology, and OpenAI in particular, as exploitative and as violating the copyright of human artists by scraping and training on their prior works without informed consent or compensation.