Opinion and Insight

Beyond VR: 9 Other Bits of Tech That Have the Post Industry Excited

LBB Editorial, 4 months ago

Virtual reality is the trendy tech of today… but what other lesser-known platforms are revolutionising post production?

Virtual reality is the darling of the ad industry at the moment, as agencies, production companies and, perhaps most notably, post production companies race to get to grips with it. But beyond the sexiness of VR lies a whole host of other new technologies quietly revolutionising the workflows and capabilities of the post production industry and, by extension, the entire advertising industry. We caught up with the people in the know to find out what’s got their hearts racing right now.


1. The Lytro Light Field Camera

What is it? The Lytro Light Field Camera creates photos that can be refocused again and again, even after they’ve been shot, opening up creative opportunities in post production.



Tawfeeq Martin, Technical Innovations Manager, The Mill LA: “Traditional cameras focus on single rays of light mapped to a rectangular screen. This limited perspective places four walls between us and total immersion. Fortunately, the renewed quest for VR supported through ubiquitous computing has inspired foundational rediscovery along the way. 
 
“Lytro Light Field Camera leverages the plenoptic function of their proposition to capture moving lightfields. Many samples of light in all directions allow for four-dimensional recreation of a scene with added benefits. Imagine the freedom to change the depth-of-field, focal plane, frame rate and shutter angle as well as reposition the camera, pull mattes based on depth, grade based on depth, and much more.

“This budding technology is still physically bulky and demanding, and very much in its infancy. But it remains as exciting an advancement as the Technicolor 3-strip camera was in 1932.”
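The "refocus after the shot" capability described above is often illustrated with the classic shift-and-add technique over a grid of sub-aperture views. The sketch below is a minimal numpy illustration of that idea, not Lytro's actual pipeline; the function name, array layout and shift factor are all assumptions.

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthetic refocus by shift-and-add over sub-aperture images.

    light_field: array of shape (U, V, H, W), a grid of sub-aperture
    views such as a plenoptic camera captures.
    alpha: shift factor selecting the virtual focal plane (0 = as shot).
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its offset from the
            # centre of the aperture, then accumulate.
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)
```

Points that line up after the shift reinforce sharply (in focus); points that don't are averaged away into blur, which is why the focal plane can be chosen freely after capture.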



Duncan Malcolm, Director of 2D, Glassworks London: “Instead of ‘baking’ an image of each frame when shooting, Lytro’s light field video effectively captures a digital holographic representation of what is in front of the camera. This use of light ray technology opens up a box of tricks that up to now were either unachievable, or achievable only through technological cheats and many hours of prep work. 

“This technology allows focus, depth of field and variable shutter angle all to be set and changed post shoot, and with true optical correctness. The range of prisms which capture the light rays is such that true stereoscopic images can be made post shoot. These are perfectly correct, as opposed to the post stereoscopic techniques that are now commonly used. The depth information captured during shooting also allows depth screen mattes to be easily created, and slices of the image to be isolated in pixel-perfect condition in a compositing environment. The complete extension of this is to render CG elements as comparable 4D layers, allowing whole scenes to be composited using true light fields. The overall optical look of the scene is completely flexible, even after delivery. Amazing!”

David Elkins, VFX Supervisor/Lead Flame Artist, Scarlett: “Lytro’s Light Field camera opens the door to an infinite number of creative decisions in post that may otherwise be impossible after the footage is shot. Using the metadata, artists have the ability to control photographic elements of a scene like depth of field, focal length, perspective, aperture, and shutter angle in every frame. Along with this newfound flexibility, Lytro claims the tracking data would make the integration of CG elements into live action easier and more seamless, and the generation of a depth screen makes it possible to layer almost anything into a scene without matting it.”
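The depth-based matting both artists mention can be illustrated in a few lines. This is a generic sketch of slicing an image by depth and compositing with the result; the function names and data layout are assumptions, not any Lytro or Flame API.

```python
import numpy as np

def depth_slice_matte(depth, near, far):
    """Matte that is 1 where per-pixel depth falls in [near, far], else 0."""
    return ((depth >= near) & (depth <= far)).astype(np.float32)

def over(fg, bg, matte):
    """Composite fg over bg using the matte, broadcast across channels."""
    m = matte[..., None]
    return m * fg + (1.0 - m) * bg
```

With a reliable per-pixel depth map this replaces hours of hand rotoscoping: any depth slice of the scene can be isolated, graded or replaced directly.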

2. Gaming Software

What is it? Software more commonly associated with game development has been taking the VFX world by storm thanks to the speed at which realistic images can be rendered. 

Kamil Dąbkowski, Head CG Supervisor, Platige Image Commercial Department: “For the past couple of years we've been observing a bidirectional transfer of technologies between the worlds of movie production and video game development. Game engines generate incredibly realistic imagery in real-time and astonish even the most sceptical of gamers. The line between CPU-rendered cinematics (a couple of hours per frame) and video game cutscenes (rendered at 60 frames per second (fps)) is getting increasingly blurred. Right now everything depends on graphics processing unit (GPU) performance. Video card manufacturers are aware of the growing demand for new advances in GPU technology and keep offering new solutions. One of the most demanding technologies to be introduced recently is, naturally, VR, which requires the simultaneous generation of two images, each at a 70-90 fps frame rate. This means VR requires at least twice the GPU efficiency of an average AAA video game title. Nvidia has recently introduced a slate of optimisation techniques, including VR SLI, which employs two GPUs hooked up in SLI mode, each card generating one image.

“In companies like ours, rendering engines commonly used in game development (including Unity and Unreal) are also extensively utilised. We use them for animatic production, look development, and to create VR experiences.”
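The arithmetic behind the "at least twice the GPU efficiency" claim is easy to check. The resolutions below are illustrative assumptions (a 1080p game versus two roughly 1080x1200 eye buffers at 90 fps), not vendor specifications.

```python
# Illustrative figures, not vendor specs: a single 1080p image at
# 60 fps vs. two VR eye buffers of ~1080x1200 each at 90 fps.
game_pixels_per_second = 1920 * 1080 * 60
vr_pixels_per_second = 2 * 1080 * 1200 * 90

ratio = vr_pixels_per_second / game_pixels_per_second
print(f"VR needs about {ratio:.2f}x the raw pixel throughput")
```

And because VR cannot tolerate dropped frames without making the viewer sick, the real headroom a GPU must provide is higher still than the raw pixel ratio suggests.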


3. Cloud Rendering

What is it? Google and software developer Autodesk have collaborated on a rendering solution that brings the benefits of cloud rendering to individual studios and artists. Artists no longer have to be tied to an actual piece of hardware to perform the task.

Aron Hjartarson, Executive Creative Director, Framestore LA: “We love virtual reality, but there are massive things happening at an infrastructural level informing everything post-production shops are creating. Traditionally, VFX artists have been tied to physical workstations, render farms, and disparate pieces of content from multiple sources. As cloud-based platforms continue to advance we're no longer tied to the same data storage, workflows, or processor speeds. Powerhouses like Google and Autodesk have teamed up for lightweight cloud-based services like ZYNC Render. 

“How is this exciting for a company like Framestore that already has established infrastructure? Isn't this lowering the barrier of entry and giving everyone access to what gives Framestore an edge over smaller outfits?

“Framestore isn't about brick, mortar, or technological infrastructure but rather about building teams that work together according to the ethos of its culture. That is our competitive advantage. Lower barriers to entry for individuals simply deepens the pool of talent we have to work with, and all our teams, large and small, will benefit greatly from this new tech.”
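The fan-out a cloud render service performs can be sketched locally with a worker pool. The functions below are a toy stand-in, not ZYNC's real API; every name here is an assumption, and a real service would run each frame on a remote machine rather than a local thread.

```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame):
    # Toy stand-in for a renderer; a cloud service would execute the
    # actual render remotely and return the finished frame file.
    return f"frame_{frame:04d}.exr"

def submit_shot(first, last, workers=4):
    """Fan a frame range out across workers, as a cloud render queue would."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, range(first, last + 1)))
```

The appeal is that `workers` stops being bounded by the machines in the building: the same shot can be spread across as many cloud instances as the deadline demands.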


4. Baselight 5

What is it? The latest version of one of the world’s most popular colour grading systems.



Adrian Winter, VFX Supervisor, Nice Shoes Creative Studio: “These days VFX and post artists across all disciplines have been taking on more tasks, often leading artists to become familiar with multiple programs to keep up. Software developers have begun to respond, developing tools and workflows which allow for more integration from one tool to another. 

“One of the exciting examples I saw of this at NAB was from Filmlight, who have incorporated a number of tools into Baselight 5 which give colourists much more creative control over the shots they are grading. These include a grid warper, paint tools for logo removal and blemish correction, perspective tracking, stronger keying tools, and perhaps most exciting: support for Flame Matchbox shaders. I think these features will go a long way towards allowing colourists to do more comping and clean-up in session.”


5. Autodesk Flame 2017

What is it? The latest version of Autodesk’s 3D visual effects software.

Ton Habraken, Visual Effects Director and Founding Partner, Ambassadors: “VR is obviously a very tantalising and visible development but there is so much progress out there – which might not be as sexy – but definitely has the people in the visual effects world excited. The latest version of Autodesk’s Flame is just one thing that has me excited. Flame has been the go-to tool for creative visual effects in the fast-paced advertising industry and now, with the 2017 release, they’ve added a lot of new tools and technology which are also being used in real-time game engines. This means that a lot of effects are close to being real-time, and this gives us, as artists, much more time for the creative process as opposed to having to wait until a render is done.”

6. Augmented Reality 

What is it? Reality-enhancing technology currently sitting slightly in virtual reality’s shadow.

Simon Gosling, Creative Evangelist, Happy Finish: “I'm often asked the difference between augmented reality (AR) and virtual reality (VR). Put simply, AR brings digital content into the real world via a screen, whether that is a tablet, smartphone or poster, whereas VR uses a headset to place us in an altogether different and virtual environment.

“While VR has been grabbing the headlines, AR has quietly sat in the background, but is nonetheless providing brands with just as exciting ways to share their stories with consumers.

“Only this month, after 91 years in print, The New Yorker magazine leapt into the future with an augmented reality cover on the May 8th issue. Created by artist Christoph Niemann for the magazine's annual Innovators Issue, the simple yellow and black cover comes to life when viewed through a tablet or a smartphone, turning into a three-dimensional animation that lifts off the page.



“The next generation of augmented reality and virtual reality is mixed reality (MR), which uses a transparent headset to bring holograms into the real world. Companies like Microsoft, with its HoloLens, and Magic Leap are creating technology to transform the way we interact with data, entertainment and information. Imagine, one day in the future, being able to sit in a Globe Theatre in Manchester, watching holograms of live performers as they act on stage in London's Globe Theatre.

“A long time ago, in a galaxy far, far away, a hologram of Princess Leia pleaded, "Help me, Obi-Wan Kenobi. You're my only hope." Well, in a town near you, right here and now, holograms are becoming a lot more sophisticated, and you're going to be seeing a whole lot more of them!”



7. Mocha Pro 5

What is it? An upcoming version of image tracking software used by VFX artists.

Adrian Winter, VFX Supervisor, Nice Shoes Creative Studio: “One exciting development comes from Imagineer, makers of Mocha Pro. In version 5 of Mocha Pro, Imagineer has developed plugin versions of Mocha for After Effects, Premiere, Avid and Nuke that can be applied to shots directly within their respective host applications, eliminating the need to import and export data between two different programs.

“While better time management and efficient workflows might not be the sexiest of topics, it cannot be denied that these days deadlines are shrinking. Tools that allow for increased efficiency help the artist, which in turn allows the artist to better service their client. I am pleased to see software developers recognize and react to this need, and am excited to see what further developments will be coming in the future.”

8. Shared VR

What is it? Virtual reality experiences built for groups to share, countering the common criticism that existing VR setups are isolating.

Jon Collins, President of Integrated Advertising, Framestore: “We did a bus for Lockheed. We built a Sesame Street-style school bus and turned the bus itself into a VR headset, which McCann and Lockheed are pitching as the first ever shared VR experience. It was one of the most progressive things we’ve ever done, probably the most stressful job I’ve done in 30 years, and when I talk about risk, it’s when you’re absolutely on the edge of technology. We were building technology that, frankly, doesn’t even exist. That was hugely risky. We pulled it off and delivered this incredible experience in Washington DC.”



9. Kinetic Sculptures

What is it? Sculptures that respond to the wind and make buildings ripple.

Tim Dillon, Executive Producer, VR, MPC USA: “Not everything needs to be digital; just take a look at these amazing kinetic material sculptures that use wind to displace the surface to create stunning results.”