Fully designed using AI-generated visuals from DALL-E (a deep learning model developed by OpenAI that generates images from text prompts), Critterz features a range of cute characters in an animated science documentary turned comedy, introducing an unexplored forest inhabited by mysterious little Critterz with unexpected personalities.
In this interview with LBB, Chad and Nik share the origins of this creative project, the role of AI in creativity, and how DALL-E technology enabled them to achieve incredible results.
LBB> Critterz is a first-of-its-kind AI-powered animated film built using OpenAI’s DALL-E technology. What was your reaction when this brief came to you and what were your initial creative ideas?
Nik> Chad was actually the first person to ever show me DALL-E – it was totally mind-blowing. He’s an early adopter of all sorts of things, so there wasn’t so much a 'brief' as an intention to do something cool. We’re both pretty busy, so it was a matter of finding the time to work on a side project – and I’m really glad we did.
LBB> How did the concept develop from there?
Nik> We’re only about a year into the world of text-to-image and a lot of the video examples we were seeing felt more like tech demos than true storytelling. Since we are storytellers at heart, we decided to take a different approach and focus on world building. Could we use AI to actually craft IP?
LBB> What was it like collaborating with OpenAI on its first commissioned film?
Chad> First, it was amazing to see their excitement for the project! For months DALL-E had been generating millions of still images, but only a few artists had experimented with it for video, and most of those projects were simply R&D demonstrations. This would be the first true short film, and the team at OpenAI couldn't have been more supportive.
LBB> What were some of the creative and production problems/challenges on this project and how did you find a solution?
Chad> There were two primary challenges that we had to overcome from the start. The first was bringing the 2D still backgrounds to life so they felt more three-dimensional. We solved this using traditional parallax animation techniques with tools like After Effects. The second was getting each of the characters to speak, and there were two approaches we experimented with: hand-animation and facial performance capture. There were pros and cons to each, but in the end, facial performance capture won out. This was primarily due to scale and cost: once we had each character set up and rigged in Unreal Engine, we could record any number of lines of dialogue without much impact on the budget or schedule.
LBB> The film will demonstrate how AI can empower creatives to better achieve their vision. Can you tell us about how this works?
Chad> Creativity is such an interesting process as there is no 'one size fits all.' And when that spark goes off in a creative's mind, the next question often becomes, "How long will it take to realise that vision... to visualise it?" This is where AI is becoming the most transformative. The speed at which creatives can ideate is now unrivalled. Using a tool like DALL-E, you can visualise fully rendered ideas every 15 seconds, with four variations each time. Never before have we been able to explore ideas at that speed, allowing creatives to push the boundaries of their ideas further as the time and cost factors have diminished.
LBB> What were your personal highlights from this project?
Chad> My personal highlight was seeing Blu the Red Spider speaking for the first time, with the talent's voice-over and all the animation in sync. That was the moment we left "experimenting with technology" behind and witnessed a character brought to life!
LBB> What is your favourite scene and why?
Nik> The intro is pretty phenomenal and truly a love letter to the magic of DALL-E… but I also love when the Critterz start talking about merch and lunchboxes. We cut to Frank and he’s all swagged out in Critterz gear. I love a nice sight gag, and it shows we’re not taking ourselves too seriously.
LBB> How did you feel when you saw the final edit? What reactions have you received so far from audiences and the client?
Nik> Seeing the film go out into the world has been fascinating. There’s obviously a larger discussion about AI at play right now, and we’re all ears: the good, the bad, the many, many questions. We felt it was important to have a companion behind-the-scenes film, which our agency Native Foreign also produced, to help discuss the process. The sentiment has been largely positive and we can’t wait to share what we have up our sleeves next.