Virtual Production in association with The Immortal Awards

Using Real Life Environments in a Virtual Production with Brett Danton

21/09/2022
The director shares the benefits and challenges of using Omniverse and Unreal Engine, and why he believes both should be in everyone's creative toolbox, writes LBB’s Nisna Mahtani


Born in New Zealand and raised in the UK, Brett Danton spent the first two decades of his career working as a photographer. His interests in photography and the moving image culminated in him becoming a director, providing creative, stills and TVCs for commercial campaigns and projects.

Due to the technical similarity of shooting stills and moving image, and his background as a director of photography, the director avidly adopts the latest creative technologies - lecturing on behalf of Canon and helping the camera company develop its workflow, equipment and tech. Represented by NM Productions, his portfolio of work includes campaigns for car companies Jaguar and Land Rover, as well as music videos for the pop-rock band, Bastille. 

Having become fascinated with virtual production after recent experience with it, Brett sits down with LBB’s Nisna Mahtani to share his thoughts on the industry’s two leading production platforms, Omniverse and Unreal Engine, and to explain how real-life environments are used to support the virtual process.



LBB> When did you first get into virtual production?


Brett> I got thrown into the deep end on a first job with Hogarth and WPP, where they wanted to examine where they could take virtual production and what they’d see in the future. That was interesting because what we decided to do was go up to Scotland and LiDAR scan a forest, to see how high a resolution we could take into a virtual world.

We looked at two different cases. The first was whether it could work as a purely digital environment or be used inside volume stages as a background. The second – and this holds no matter how good your digital environments are – was that you still need traditional imagery backplates shot on camera. So, we had to figure out how to work backwards to make the digital assets work with traditional ones.

I’d never done any 3D work up until then, but we went off and used Omniverse, which has a path-tracing renderer where the reflections are true to life, and it’s what I work on today. We scanned an entire forest, and no one had done it before! It took four weeks to open the files, but we could see where the potential was.

If you look at most worlds that are made purely from CG, they’re overdone. So this technique also works if you want to use it as a background reference because all of a sudden, you have a background which is real, it’s organic and you can drop actual, real assets over it.
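For readers who want a feel for the pipeline: Omniverse exchanges scenes in the open USD format, so a scan like that forest can be poked at with a few lines of Python. A minimal sketch, assuming the open-source pxr bindings and a hypothetical file name:

```python
# A minimal sketch, not production code: Omniverse exchanges scenes in
# USD, so a LiDAR scan brought into it can be inspected with the
# open-source pxr bindings. "forest_scan.usd" is a hypothetical file.
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open("forest_scan.usd")

total_points = 0
for prim in stage.Traverse():
    if prim.IsA(UsdGeom.Mesh):
        pts = UsdGeom.Mesh(prim).GetPointsAttr().Get()
        if pts:
            total_points += len(pts)

# A high-resolution environment scan runs to hundreds of millions of
# vertices, which is part of why files this size take so long to open
print(f"{total_points:,} mesh vertices on the stage")
```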


LBB> So, the original project you worked on was a car driving through the woods?


Brett> Yes! So that was used in videos and in a keynote speech at GTC [a tech conference] as well, and we were pretty honoured by that. However, with the rendering of the Volvo campaign we did, if we went back and did it now, it would be 20 times better. It was done a year and a half ago as a test on a beat-up piece of software. The software has come a long way since: new pieces have been released, the rendering is much better and it’s much more stable.

For me, that project was the first time I’d seen anything where, as far as I was concerned, the light looked right and the reflections were correct – that was my excitement surrounding it. As a DoP, I can then take that scene, where the forest exists in Scotland, and light it the way we’d normally do on set. Better, actually, because if I want to suddenly put a 100-metre softbox above it, I can do that inside a virtual world.
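That 100-metre softbox is, in the end, just scene data. A minimal sketch of authoring an oversized area light in USD, assuming metre scene units and hypothetical prim paths:

```python
# A minimal sketch: authoring the "100-metre softbox" as a USD rect
# light above a scanned set. Stage, prim paths and intensity are
# hypothetical; scene units are assumed to be metres.
from pxr import Usd, UsdGeom, UsdLux

stage = Usd.Stage.Open("forest_scan.usd")

softbox = UsdLux.RectLight.Define(stage, "/World/Lights/Softbox")
softbox.CreateWidthAttr(100.0)       # 100 m x 100 m emitting surface
softbox.CreateHeightAttr(100.0)
softbox.CreateIntensityAttr(3000.0)  # tune to taste in the viewport

# Float the light 30 m above the forest floor
UsdGeom.XformCommonAPI(softbox.GetPrim()).SetTranslate((0.0, 30.0, 0.0))

stage.GetRootLayer().Save()
```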




LBB> What are some of the most exciting parts of virtual production as you look to the future?


Brett> Suddenly, you can do things which you always dreamt about doing. Yes, getting a scan in is quite expensive and time-consuming – there are a lot of things to work out – but once you’ve got it, you’ve got everything else that goes with it. If you need to shoot a car, you need lighting and generators, and the budget starts going up and up. If you need to rig a 30-foot softbox over the top of a car in the middle of a Scottish forest, you can do that on your desktop. All of a sudden, my desktop has become my entire studio. Originally it felt like sacrificing the look and the lighting, but now I don’t feel that’s happening – the light is reflecting back on the surfaces of the cars, for example. It does need to be pushed on, but it seems that every three months it just keeps getting better and better.

Once AI can come in and analyse the scene, I’m sure things will improve too. Presently, the software struggles with things like grass, so someone has to tell the software what it is. I’ve also spoken to lens manufacturers, who I think should sell a virtual equivalent of their physical lenses, so that when you use them on a shoot, you can use them in the virtual world too – that way, those lenses can stay associated with a DoP. We said it wouldn’t happen with stills, but it did, so I think that’s what is opening up. Omniverse also has physics built in, so you can see things like the suspension moving as the car sets off, and the camera chasing it.
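The physics Brett mentions maps onto USD’s physics schema, which Omniverse’s PhysX integration reads. A rough sketch of flagging a scanned car as a simulated rigid body, with hypothetical stage and prim paths:

```python
# A rough sketch of the physics side: USD's physics schema (read by
# Omniverse's PhysX integration) can mark a scanned car as a rigid
# body. File and prim paths are hypothetical.
from pxr import Usd, UsdPhysics

stage = Usd.Stage.Open("car_in_forest.usd")

# One physics scene per stage holds gravity and solver settings
UsdPhysics.Scene.Define(stage, "/World/physicsScene")

car = stage.GetPrimAtPath("/World/Car")
UsdPhysics.RigidBodyAPI.Apply(car)      # the car responds to forces
UsdPhysics.CollisionAPI.Apply(car)      # and collides with the terrain

terrain = stage.GetPrimAtPath("/World/Terrain")
UsdPhysics.CollisionAPI.Apply(terrain)  # static collider: no RigidBodyAPI

stage.GetRootLayer().Save()
```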


LBB> Talk to us about the software. You’ve mentioned Omniverse, but are there any others you use, such as Unreal Engine?


Brett> We’ve been jumping around between different pieces of software. Omniverse is the one we’re using at the moment, but we just finished a project for the metaverse which, funnily enough, was inside Unreal Engine. For that project, Unreal Engine sat better – it’s a faster real-time renderer, and the technology was able to drive the frustum on an LED wall. So they each have their pluses and minuses, and I personally believe people need to learn both.




LBB> Many say you can teach yourself how to use virtual production software. What would you say to that?


Brett> Before the job we did with Bastille, I’d never used Unreal Engine. I kind of taught myself, although I’m sure anyone who actually knows how to use it would take one look and have an absolute heart attack. But it’s out there. The clip we rendered was finished as a 360° VR piece, and we reused the same build to produce another single inside the city, which we built in 360 but put out as a normal clip. The music video received 650,000 views in three weeks – and that’s from someone who hadn’t picked up Unreal Engine before.

I’d say both of them have a steep learning curve. I could see the industry heading here around five years ago and gave the tech a go, but eventually I gave up. When the job came along, though, I concentrated and taught myself. Also, depending on the job, you may only need to do a mock-up, after which there are specialists in every area to support you. There are different ways of doing things, depending on what your budget is.


LBB> When you’re talking about virtual production, which parts of it are you speaking about? And why does it naturally become an amalgamation of different aspects?


Brett> It’s what I call a big conglomerate. What we’re trying to do is get a pipeline that goes from the real world all the way to virtual production. So the assets we’ve created, and what we did for the Bastille piece in Unreal Engine, came from shooting the band against the environment in a volume stage with a big LED wall with a 1.5mm pixel pitch. We had fans live streaming as avatars from around the world, so we went with that screen because we could virtually hold the sharpness – whereas with some of the bigger dots, it doesn’t come through as sharply on camera.

Because of this, the band could literally reach out and touch the fans while they were appearing as avatars. Then, for the VR experience, we took that and put it back into billboards that were built inside the virtual space and rendered it in 360. In short, that was accomplished by using every form of technology going. If they [the band] wanted to, they could do another virtual concert inside the city featured on the billboards. The whole idea was for it to become a music experience, and that was when people were really looking for other ways to release music.
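Pixel pitch, for reference, is the centre-to-centre LED spacing in millimetres, so the 1.5mm screen Brett describes packs far more pixels into a wall than coarser panels do. A quick back-of-envelope illustration (the wall dimensions here are hypothetical):

```python
# Back-of-envelope only: how pixel pitch (LED spacing in mm) sets the
# resolution of an LED wall. The 10 m x 5 m wall is a hypothetical size.
wall_w_m, wall_h_m = 10.0, 5.0

for pitch_mm in (1.5, 2.6, 3.9):
    px_w = wall_w_m * 1000 / pitch_mm
    px_h = wall_h_m * 1000 / pitch_mm
    print(f"{pitch_mm} mm pitch: {px_w:,.0f} x {px_h:,.0f} px")

# 1.5 mm pitch: 6,667 x 3,333 px  -- fine enough to hold sharpness
# 3.9 mm pitch: 2,564 x 1,282 px  -- the "bigger dots" that go soft on camera
```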


LBB> There are now things like virtual location scouts and conversations about intellectual property which have come into question. Can you tell us more?


Brett> It’s an absolute minefield. The thing is, we don’t use the original footage. We might use some beautiful wide aerials, but you don’t know where those angles are in the world. We can also scan the environment and change things. I can take a mountain and move it to the left, or I can say I don’t like that field and drop it into another field. So when you do all of that, suddenly your location doesn’t look like your location. I guess there are two ways of doing things. You can either scan assets and get them yourself, or you can buy a virtual asset – like a virtual Leaning Tower of Pisa – to drop in. In that case, however, I’m assuming you’d have to pay royalties.

However, there are apps which are really easy to use, where you can go out and LiDAR scan with your iPhone, drop the scan into a 3D world and block it all out. That was designed for DoPs who want to create a physically accurate model. Yes, it’s pixelated, but it’s good enough for most things.
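For a sense of how lightweight that blocking workflow can be, here is a sketch of loading a phone scan and reading off its real-world footprint; trimesh and the file name are stand-ins for whatever a given scanning app exports:

```python
# A sketch of sanity-checking a phone LiDAR scan before blocking with
# it: load the exported mesh and read its real-world footprint.
# trimesh and "location_scan.obj" are stand-ins for whatever the
# scanning app actually exports (OBJ, PLY or USDZ are common).
import trimesh

scan = trimesh.load("location_scan.obj")

w, d, h = scan.extents  # axis-aligned size; metres if the export is metric
print(f"Scan footprint: {w:.1f} x {d:.1f} m, height {h:.1f} m")
```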


LBB> What are some of the obstacles or aspects you’re trying to overcome at the moment?


Brett> The biggest thing is the amount of computing power that you need. That, to most people, is a massive stumbling block. The other is getting the different pieces of software and the capture devices to talk to each other, because what we find is that nobody has an out-of-the-box solution, so you end up with five different scanners and types. Each scanner has its own proprietary software and data acquisition, so what’s supposed to be a universal format ends up with none of them talking to each other. Because of this, we spend a long time trying to get everything to mesh, and that workflow is one of the hardest things we’ve done.

Also, the equipment wasn’t designed for all the different kinds of work – suddenly you’ve got industrial mining equipment which you’re trying to use in the film world. Merging everything together is what we’re working on. Covid-19 slowed the process right down because we had to wait for hardware or new pieces of gear to come along. Sometimes we have a conversation and then agree to revisit it in six months, when new hardware comes out.
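A taste of the glue work Brett describes: re-authoring one scanner’s point cloud as USD so it can travel down the rest of the pipeline. open3d and both file names are stand-ins rather than the specific tools from any shoot:

```python
# A taste of the glue work: reading one scanner's PLY point cloud and
# re-authoring it as USD so it can travel down the pipeline. open3d
# and both file names are stand-ins, not tools from a specific shoot.
import open3d as o3d
from pxr import Usd, UsdGeom, Vt, Gf

cloud = o3d.io.read_point_cloud("scanner_export.ply")

stage = Usd.Stage.CreateNew("scan_points.usda")
scan = UsdGeom.Points.Define(stage, "/World/Scan")
scan.CreatePointsAttr(Vt.Vec3fArray(
    [Gf.Vec3f(float(x), float(y), float(z)) for x, y, z in cloud.points]
))
stage.GetRootLayer().Save()
```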

