Exclusive: Watch the world premiere of the AI-generated short film The Frost.

The trouble is that not every business has a website or images to draw from, says Parker. “An accountant or a therapist might have no assets at all,” he says. 

Waymark’s next idea is to use generative AI to create images and video for businesses that don’t yet have any—or don’t want to use the ones they have. “That’s the thrust behind making The Frost,” says Parker. “Create a world, a vibe.”

The Frost has a vibe, for sure. But it is also janky. “It’s not a perfect medium yet by any means,” says Rubin. “It was a bit of a struggle to get certain things from DALL-E, like emotional responses in faces. But at other times, it delighted us. We’d be like, ‘Oh my God, this is magic happening before our eyes.’”

This hit-and-miss process will improve as the technology gets better. DALL-E 2, which Waymark used to make The Frost, was released just a year ago. Tools that generate short video clips have been around for only a few months.

The most revolutionary aspect of the technology is being able to generate new shots whenever you want them, says Rubin: “With 15 minutes of trial and error, you get that shot you wanted that fits perfectly into a sequence.” He remembers cutting the film together and needing particular shots, like a close-up of a boot on a mountainside. With DALL-E, he could just call it up. “It’s mind-blowing,” he says. “That’s when it started to be a real eye-opening experience as a filmmaker.”

Chris Boyle, cofounder of Private Island, a London-based startup that makes short-form video, also recalls his first impressions of image-making models last year: “I had a moment of vertigo when I was like, ‘This is going to change everything.’”

Boyle and his team have made commercials for a range of global brands, including Bud Light, Nike, Uber, and Terry’s Chocolate, as well as short in-game videos for blockbuster titles such as Call of Duty.

Private Island has been using AI tools in postproduction for a few years but ramped up during the pandemic. “During lockdown we were very busy but couldn’t shoot in the same way we could before, so we started leaning a lot more into machine learning at that time,” says Boyle.

