When ‘Saturday Night Live’ (SNL) celebrated its 50th anniversary earlier this year, it offered fans the chance to watch the show in a way they’d never done before. In addition to broadcasting the three-hour special on live television, the show released a number of sketches in immersive 360-degree video, designed to be watched on a Meta Quest VR headset or on YouTube.
Marking five decades on air, the special brought together a host of guests, celebrity talent and some of the original cast members for a night of celebration, comedy and musical performances. Offering a glimpse into the action going on beyond the stage, the immersive content allowed viewers to place themselves within the experience and look around the set to spot reactions from audience members and catch the camera operators and crew hard at work.
The content also had to deliver the look and feel that SNL fans have come to expect – encompassing both the glitz and glam of showbiz and the fast-paced energy of a live performance. To achieve this, production company Light Sail VR, alongside Meta, SNL and NBC/Universal, enlisted the expertise of colour grader Jeff Sousa and his boutique post production company, Dungeon Beach.
Diving into the finer details of the complex project, LBB’s Abi Lightfoot spoke with Jeff to find out more about the intricacies of immersive post production, the experience of colour grading whilst wearing a headset, and the importance of bringing 360-degree video to a wider audience.
Jeff> Immersive post means forgetting what we know about ‘flat’ image viewing and rethinking how colour and audio should play when you are meant to be inside of an experience. Practically speaking, this means making creative decisions while wearing a VR headset. Once you enter that virtual world, you realise that traditional filmic looks don't quite work. Things like grain, muted highlights, or faded black levels become distracting. Instead, maximising dynamic range – making bright lights feel intense, and letting dark corners drop off into nothing – gives you that sense of ‘really being there’.
Of course, at that point, there is still much grading to be done, but it becomes more about defining and unifying the colour palette and relighting the 360-degree space to help focus the audience, since they can otherwise look anywhere they want!
Jeff> It began because we were using an unusual piece of colour grading software called Mistika Boutique, which is an alternative to DaVinci Resolve. One of Mistika's main features is that you can grade while wearing a headset. I met Matt Celia [co-founder and creative director] of Light Sail VR during a Mistika webinar, and he hired me to work on an adidas immersive experience. We went on to collaborate on many VR pieces, such as Emmy-nominated ‘Red Rocks Live’ in VR, and ‘The Faceless Lady’, a six-episode horror series produced by Meta, which is the longest piece of VR narrative content to date.
Since Dungeon Beach is a mid-sized boutique, we need to differentiate ourselves from both the big post houses AND the freelancers working from home. Immersive is interesting because the big houses aren't invested in it, and it's too much of a tech lift for individuals working from home. Coming to our Brooklyn studio, where you can grade while wearing a headset and do a fully immersive audio mix, is a good fit for us. We've had entire agency creative teams seated on our couch literally passing around the headset and giving notes while I grade, while also being able to watch a preview on the big screen monitor in HDR.
Jeff> I was hired on the project due to this long-standing collaboration with Light Sail VR (the production company for this project). However, being a New York-based post house was also pivotal, as Matt came straight to our Brooklyn studio the morning after the show taped. We were able to post three sketches within about 72 hours – a miracle for VR content. Because it's all shot in stereo 3D, there is a necessary first step in post production of aligning the left and right eyes so that the 3D effect feels natural. And for grading, you need to make sure any vignettes or shapes you create to focus the audience's attention are drawn equally on the left as well as right eyes. On top of that, we also had to up-res some of the footage to 8K in order to maximise sharpness and detail. We were very much under the gun!
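Jeff's point about drawing shapes "equally on the left as well as right eyes" can be sketched in a few lines of NumPy. This is not Mistika's actual implementation – just an illustrative toy, assuming a top-bottom stereo frame layout – but it shows the principle: build the grading shape once and apply the identical mask to each eye, so the grade never differs between eyes and disturbs the 3D effect.

```python
import numpy as np

def make_vignette(h, w, strength=0.5):
    """Radial falloff mask: 1.0 at the centre, darkening towards the edges."""
    ys = np.linspace(-1.0, 1.0, h)[:, None]
    xs = np.linspace(-1.0, 1.0, w)[None, :]
    r = np.sqrt(xs ** 2 + ys ** 2)
    return np.clip(1.0 - strength * r, 0.0, 1.0)

def grade_stereo_tb(frame):
    """Apply the same vignette to both eyes of a top-bottom stereo frame.

    `frame` is (2H, W, 3) float: top half = left eye, bottom half = right eye.
    The mask is computed once and multiplied into each eye, guaranteeing
    the two eyes receive an identical adjustment.
    """
    h2, w, _ = frame.shape
    h = h2 // 2
    mask = make_vignette(h, w)[..., None]  # one mask, shared by both eyes
    out = frame.astype(float).copy()
    out[:h] *= mask  # left eye
    out[h:] *= mask  # right eye
    return out
```

In a real 360 pipeline the shape would also need to be drawn in the equirectangular projection, but the symmetry requirement between eyes is the same.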
Jeff> The main brief was, understandably, to make the show feel like SNL! (Which is not a very stylised look – it is, in fact, a classically balanced broadcast look). Initially the main challenge was unifying the look between the various VR cameras used in filming. Some angles were shot on the Canon R5C, while others were captured on the Meta VR camera system.
Matt's team had to be creative in capturing the show, since it was being taped simultaneously for broadcast. So, they strapped VR cameras on TOP of the regular SNL crew cameras, and hid others backstage. Even so, getting the different cameras to match on, say, Steve Martin's skin tone was difficult.
On the creative side, I really wanted to capture the ‘liveness’ of SNL, particularly the intensity of the stage lights. That's, in part, why we decided to grade natively in HDR using Dolby Vision. So you can really feel the heat and glamour of those lights.
Jeff> I think the coolest aspect of this project is that you, the viewer, can choose where you want to look at any given time. You can look straight ahead, observing the sketch as it unfolds, or you can turn around and catch an off-the-cuff reaction of an SNL crew member. Or, if you're into celebrity-watching, you can observe the significant other of whoever is performing on stage and enjoy their reactions. It truly feels like you have a backstage pass to the legendary ‘Saturday Night Live’.
But, because the viewer can be looking anywhere, I had to do many targeted adjustments to the audience. Not only did we have to bring their levels up so their reactions could be appreciated (they're sitting in the dark), but I also had to make sure everyone's skin looked nice, because the same look that worked for the sketches – under bright and controlled lights – did NOT work for the audience.
Jeff> I do always grade while wearing a headset, and that is unique to my workflow compared to other colourists, who typically grade on a monitor and then review in the headset.
Using software developed by Light Sail VR called VR.NDI we can cast our colour session, live, to the VR headset. It's important to judge the colour there, because the Meta Quest has a bigger colour space – called P3 instead of Rec709 – and if you don't grade with it on, your colours will end up looking too saturated. Also, once you put the headset on, you appreciate how much brighter you need to push your highlights and how much darker you need to push your shadows in order to feel like you're truly there.
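The gamut mismatch Jeff describes can be demonstrated with standard colorimetry. The sketch below (illustrative only – nothing to do with VR.NDI's internals) uses the published linear-RGB-to-XYZ matrices for Rec.709 and P3-D65: converting P3's pure red primary into Rec.709 lands outside the [0, 1] range, which is exactly why colours graded on a Rec.709 display look over-saturated once a wider-gamut headset stretches them out.

```python
import numpy as np

# Linear RGB -> XYZ matrices (D65 white point), per the respective specs.
M_709 = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
M_P3 = np.array([[0.4866, 0.2657, 0.1982],
                 [0.2290, 0.6917, 0.0793],
                 [0.0000, 0.0451, 1.0439]])

def p3_to_709(rgb):
    """Convert a linear P3-D65 RGB triplet to linear Rec.709 RGB via XYZ."""
    return np.linalg.inv(M_709) @ (M_P3 @ np.asarray(rgb, dtype=float))

# P3's red primary cannot be reproduced on a Rec.709 display:
# the converted triplet has R > 1 and negative G/B components.
print(p3_to_709([1.0, 0.0, 0.0]))
```

Both matrices share the D65 white point, so neutral greys pass through unchanged; it is only the saturated colours near the gamut boundary that diverge between the two displays.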
Of course, when you have the headset on, you can't see the colour control surface or your keyboard and mouse. So, in a way it's like being Luke Skywalker – blindfolded and ‘using the force’, where you keep your hands on the rings and balls and kind of feel your way through the grade, avoiding typical tools like curves and qualifiers where you'd need to keep your eyes on a computer screen.
Jeff> The initial idea with grading HDR was to maximise the dynamic range in order to truly feel the brightness of the stage lights. But what's amazing with Dolby Vision is once you create your HDR master, you can automatically generate versions optimised for YouTube and for the headset, since both have unique colour spaces. This allows you to grade once and generate versions which feel fully (but not overly) colourful for both platforms simultaneously.
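Dolby Vision's actual trim-pass maths is proprietary, but the idea of "grade once, derive versions for lower-capability displays" can be illustrated with the PQ transfer function (SMPTE ST 2084) that HDR masters are encoded in. The rolloff below is a toy Reinhard-style stand-in, not Dolby's algorithm: decode the PQ signal to light levels, compress highlights towards the target display's peak, and re-encode.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants; linear light is normalised so 1.0 = 10,000 nits.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(linear):
    lp = np.power(np.clip(linear, 0.0, 1.0), M1)
    return np.power((C1 + C2 * lp) / (1.0 + C3 * lp), M2)

def pq_decode(signal):
    sp = np.power(np.clip(signal, 0.0, 1.0), 1.0 / M2)
    return np.power(np.maximum(sp - C1, 0.0) / (C2 - C3 * sp), 1.0 / M1)

def derive_lower_peak(signal, target_peak=100.0):
    """Toy stand-in for a trim pass: decode PQ to nits, roll highlights off
    towards `target_peak` with a simple Reinhard curve, re-encode to PQ."""
    nits = pq_decode(signal) * 10000.0
    rolled = nits / (1.0 + nits / target_peak)  # bounded by target_peak
    return pq_encode(rolled / 10000.0)
```

A real Dolby Vision pipeline carries per-shot metadata so the mapping can be tuned creatively rather than applied as one fixed curve, but the shape of the problem – one HDR master, many display-specific renditions – is the same.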
Jeff> Honestly, the positive reaction in the YouTube comments was the most exciting aspect. Often when we work on immersive projects, many people are fans, but others are skeptical or hesitant, simply because it's a new medium. To see over one hundred thousand views for some of the sketches, along with glowing reactions to the format, really gave me a boost of additional inspiration to continue down this path of specialising in post for immersive content.
Jeff> Even though I graded via the headset, Dolby Vision ensured that the YouTube experience was just as vivid. That was important because, so often, people aren't able to see or appreciate our work due to the requirement of owning a headset. I think making excellent YouTube versions of this immersive experience really allowed a much, much wider audience to not only enjoy our work, but sample what 360-degree content is all about.
You can check out the immersive experience on YouTube.