While we’re not quite to the point where Alexa can create movies for us in real time, artists in the AI (artificial intelligence) community have been creating movies with AI tools. And by the end of this year, anyone should be able to create photorealistic movies with tools like Runway, DALL-E, and Unreal Engine.
Two weeks ago, I would have said that creating photorealistic movies was a few years away. But with the speed at which AI is moving daily (it’s like trying to drink from a fire hose), photorealistic movie creation by the end of this year is a real possibility.
Here’s an example of how far AI movie-making has already come: Critterz, an animated short created by Chad Nelson with DALL-E.
Creating Movies with AI | Fragile Soul
I’ve been working on shooting the live-action film Fragile Soul for several years. It’s taken a while because of budgetary and time constraints, but this year I completed additional footage shoots in the US and UK. With editing just about done, I found myself wondering about all of the scenes I had to leave out of the movie due to those constraints.
This led to the question: are the additional scenes necessary, or would they add bloat?
Ultimately, I decided there were about a dozen scenes that would help with character development and storytelling. And that’s the main reason I’ve decided to re-shoot the entire movie with AI once the technology can produce photorealistic results, particularly of humans.
Creating Movies with AI vs. Live-Action
It comes down to three things: cost, creativity, and the end result.
First, the easy part: cost. A full 120-minute live-action movie will cost in the six- or seven-figure range at the very minimum. Even that range assumes you have friends and family willing to work for free or nearly free, access to free locations, and the patience for a long production. Christopher Nolan used this approach for Following, one of his earliest films.
Second, creativity is the question mark. I enjoy the process of movie-making with a team of people. But I also enjoy the new process of creating images with the current state-of-the-art AI movie-making technology. I’ve taken a lot of notes and have plenty of behind-the-scenes (BTS) footage from the live-action version of Fragile Soul. I’ll be doing similar note-taking and collecting outtakes/BTS for the AI version of the movie.
Third, the end result: I’ll be judging this movie as I do every movie I watch:
- Does it tell a good story?
- Are the characters well fleshed out?
- Am I emotionally attached to the characters (love, hate, etc.)?
One additional question: was one method substantially more fun than the other?
Frankly, given the significant time and financial savings of AI movie-making, live-action movie-making would need to be much more fun to be worthwhile.
That’s my hypothesis. We’ll see what happens when the AI version is complete.