Collider Studio built a playable, open-world game in Unreal Engine to create two music videos for H3000's debut album. The films feature the musical duo - Luke Steele (Empire Of The Sun, The Sleepy Jackson) and Jarrad Rogers (Charli XCX, Lana Del Rey) - as playable characters in the form of their own sci-fi avatars.
The two videos tell a single story in two parts. 'Running' features the duo trekking across desolate black tundra before being chased by an ominous group of 'hunters'. The second video, 'Flames' - released several months later - goes back in time to show why the hunters are so desperate to stop the duo before they make it back to their ship.
Not only the avatars but the entire playable universe was ideated, designed, and developed by the Collider Studio team. They are among the first major music videos to be created entirely in Unreal Engine as a playable game.
Before the characters were ready to be dropped into the virtual set, every piece of the game map was generated and virtually scouted on foot in-engine, across what in real-world terms would have been a set measuring six by two kilometres.
The avatars were modelled in ZBrush, while iPhone face-scanning software was used to capture their heads in a simple, low-fidelity way that would work in the game engine. Their bodies were mapped onto an existing Unreal locomotion 'rig' character that could be controlled with an Xbox controller.
Collider’s Technical Director Hugh Carrick-Allan created a system that allowed endless experimentation and iteration of camera coverage and capture.
With the virtual set taking 25 minutes to run across, the game map was divided into chaptered sections to streamline the capture process, so the characters could be dropped directly into any of these areas and work on those scenes immediately. After recording the characters' actions, the team could activate replays, using a standard in-game spectator camera, to play the action through.
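The chapter system described above can be sketched, engine-agnostically, as a lookup from chapter names to spawn points. This is a minimal illustration only: the chapter names and coordinates are hypothetical, and the actual system was built by Collider inside Unreal Engine itself.

```python
# Hypothetical, engine-agnostic sketch of a "chaptered map" capture workflow.
# The map is divided into named chapter sections, each with a spawn point,
# so a character can be dropped straight into a scene instead of traversing
# a map that takes 25 minutes to run across.

CHAPTERS = {
    "tundra_trek": (1200.0, -400.0, 35.0),   # illustrative coordinates only
    "hunter_chase": (5300.0, 900.0, 35.0),
    "ship_return": (5900.0, 1800.0, 40.0),
}

def drop_into_chapter(name):
    """Return the spawn location for a named chapter, or raise if unknown."""
    try:
        return CHAPTERS[name]
    except KeyError:
        raise ValueError(f"Unknown chapter: {name!r}") from None

def record_sample(track, timestamp, position):
    """Append one (timestamp, position) sample to a capture track.

    A track like this could later be stepped through by a spectator
    camera to replay the recorded action.
    """
    track.append((timestamp, position))
    return track
```

In practice this lookup-and-teleport pattern is what lets a team iterate on one scene at a time: pick a chapter, spawn there, record, replay.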
Another aspect that made this music video production unique was the ability to use AI systems to drive in-world events, as many video games do.
With all the colour grading done in-engine, editor Adam Wills was provided with a huge amount of in-game footage to build out the edits in a traditional way, as he would with conventionally shot footage.
Making the clips within an open-world, real-time format meant Collider could embrace the serendipitous possibilities of this technology. While some of the traditional creative processes stayed the same (e.g. pre-scripting and post editing), rather than tightly storyboarding shots and creating only to those moments, Unreal Engine allowed the team to plan a more open journey for the duo, in the hope of finding unexpected things along the way.