Behind the Work in association with The Immortal Awards

Your Shot: Machines Created, Directed, Cast & Edited This Music Video

29/06/2016
Advertising Agency
Los Angeles, USA
Saatchi & Saatchi’s Andy Gulliman and Team One’s Chris Graves on hiding an artificial intelligence-made film within the 2016 New Directors’ Showcase

Each year at Cannes Lions, the Saatchi & Saatchi New Directors’ Showcase features a live show conceived around a theme that mirrors the most exciting happenings in the ad industry. This year it was less a live show, more an experiment. In a nod to the rising prominence of artificial intelligence, Saatchi & Saatchi, its LA agency Team One and Zoic Labs set about exploring the potential capabilities of the technology.

The result is a music video that was creatively conceived, cast, directed, shot and edited wholly by machines. Saatchi & Saatchi hid it within the NDS reel and tasked the audience with trying to pick it out from among a crop of films by some of the world’s most talented new directors. We can’t speak for every attendee, but LBB’s Addison Capper certainly didn’t guess correctly.

Instead he caught up with Saatchi & Saatchi’s Worldwide Director of Film & Content Andy Gulliman and Team One Chief Creative Officer Chris Graves to find that out, and a whole host more, about how this beguiling project came to be.

NB: Sadly, Saatchi & Saatchi aren’t permitted to show the film outside of the Palais in Cannes, but there is a behind-the-scenes video and various pictures for context within the interview.


LBB> What inspired you to launch this project to go with the NDS? 

AG> We put this brief out to the network for a theme for the year and this idea came back from Team One in LA. It seemed like a cool idea but could they really execute this? When they proved that they could, I just said “are you serious?”. We were fascinated by it. A lot of people pitched VR stuff to us but it just didn’t live up to this. When Team One presented the concept and capabilities of AI and how they were going to do it, it was almost too great a risk to not do it. We could do this exercise, keep it indoors and show it off to our clients. But the whole thing is so in keeping with the NDS, so we decided to share it on our biggest screen.

CG> The Saatchi & Saatchi New Directors’ Showcase always wants to do something bold that has never been done before. We knew that VR would be the hot topic, so we went beyond to Artificial Intelligence.



LBB> Andy, how was it for you, as a producer, to put down the tools and let the machines do their thing?

AG> Weird. Everything that’s ingrained in you as a producer, you couldn’t apply. There was no real treatment, just the lyrics fed to the AI. We got an artist to sing the song while undergoing facial recognition and linked up to an EEG headset. The machine took the data of her emotional changes as she sang, and that is how the lyrics were translated to the machine’s language. Then during the casting, the people were asked to wear the same equipment and stand in front of the camera. The machine chose the person who had the most similar data readings to the artist’s. As a producer, you collaborate and discuss who works best. But I couldn’t do that, it was a matter of the machine saying “number four”. As a producer, you’re meant to be in control, almost like the axle while everyone else is the spokes of the wheel. It was really difficult, but exciting too.
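
Gulliman’s description amounts to a nearest-match search over each person’s emotion readings. Purely as an illustration (the actual pipeline hasn’t been published), here is a minimal sketch in Python, assuming every session is reduced to a fixed-length feature vector of EEG and facial-expression scores; all names and numbers below are hypothetical:

```python
# Hypothetical sketch of the casting step described above: pick the candidate
# whose emotion readings (e.g. EEG band power plus facial-expression scores)
# sit closest to the artist's own readings while she sang the track.
import numpy as np

def closest_candidate(artist_readings: np.ndarray,
                      candidate_readings: dict[str, np.ndarray]) -> str:
    """Return the candidate whose feature vector is nearest to the artist's."""
    best_name, best_dist = None, float("inf")
    for name, reading in candidate_readings.items():
        dist = np.linalg.norm(reading - artist_readings)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Toy usage: four candidates, five-dimensional emotion/EEG feature vectors.
artist = np.array([0.8, 0.1, 0.3, 0.6, 0.2])
candidates = {
    "candidate_1": np.array([0.2, 0.5, 0.4, 0.1, 0.9]),
    "candidate_2": np.array([0.7, 0.2, 0.3, 0.5, 0.3]),
    "candidate_3": np.array([0.1, 0.9, 0.8, 0.2, 0.4]),
    "candidate_4": np.array([0.75, 0.15, 0.35, 0.55, 0.25]),
}
print(closest_candidate(artist, candidates))  # -> "candidate_4"
```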


LBB> And did you agree with the machine?

AG> Bizarrely, yes. 


LBB> Chris, can you give us a brief overview of how it was made?

CG> We assembled a cast of artificial intelligence, algorithms and machines the same way a studio would assemble a traditional film crew.

Different shots from the film


LBB> How many and which pieces of tech were used?

CG> The short film ‘Eclipse’ uses several technologies in a never-before-seen combination from start to finish:

- IBM Watson and Microsoft’s AI chatbot Ms_Rinna (Microsoft Rinna) registered the emotion behind the lyrics to generate a completely original storyline for the music video.

- In addition to helping provide the storyline, Ms_Rinna was asked for its opinions on characters, wardrobe, location and catering for the shoot.

- The team used Affectiva’s facial recognition software and EEG data to help cast the perfect co-star.

- Drones gave direction on the day of the shoot by using a combination of data from IBM Watson’s tone analysis and Affectiva’s facial recognition software. This data allowed the drones to capture intense emotional moments with mathematical precision.

- AI was used again during the edit. The team created a proprietary program that identified which clips to put where based on the beat of the song and the emotional intent of the lyrics (a rough sketch of this kind of matching appears after this list).

- All of the visual effects were created using a custom neural art program. This program allowed the machines to apply a filter to the raw footage, using reference images chosen to reflect the artist’s vision.
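
To make the edit step above concrete: choosing which clips to put where based on the beat of the song and the emotional intent of the lyrics can be read as a matching problem between beat-aligned song segments and per-clip emotion scores. The sketch below is a hypothetical Python illustration of that idea, not the team’s proprietary program; the segment boundaries, emotion labels and scoring are all assumed:

```python
# Minimal sketch of the edit-assembly idea: for each beat-aligned segment of
# the song, pick the clip whose emotion scores best match the lyric's
# emotional intent at that moment. All data here is illustrative.
from dataclasses import dataclass

@dataclass
class Clip:
    name: str
    emotion: dict[str, float]  # e.g. {"joy": 0.7, "sadness": 0.1}

def match_score(clip: Clip, target: dict[str, float]) -> float:
    """Higher is better: sum of products over shared emotion labels."""
    return sum(clip.emotion.get(label, 0.0) * weight
               for label, weight in target.items())

def assemble_edit(segments: list[dict[str, float]], clips: list[Clip]) -> list[str]:
    """Pick, for each beat segment, the clip that best matches its emotion."""
    timeline = []
    for target in segments:
        best = max(clips, key=lambda c: match_score(c, target))
        timeline.append(best.name)
    return timeline

# Toy usage: two beat segments and three candidate clips.
segments = [{"joy": 0.9, "sadness": 0.1}, {"sadness": 0.8, "anger": 0.2}]
clips = [
    Clip("close_up_smile", {"joy": 0.8}),
    Clip("rain_window", {"sadness": 0.9}),
    Clip("wide_dance", {"joy": 0.5, "anger": 0.1}),
]
print(assemble_edit(segments, clips))  # -> ['close_up_smile', 'rain_window']
```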


Behind the scenes images of the production 


LBB> When you first set about this project, how much did you know about feasibility, capability, processes, etc.? And what did you have to learn along the way?

CG> Los Angeles is at the heart of where entertainment and technology intersect, so Saatchi & Saatchi looked to Team One, and we immediately brought in Zoic Labs to help us pull this experiment off. Like any true experiment, we did research, formed hypotheses, and did a lot of testing and learning along the way.


LBB> What do you think of the quality of the final production?

AG> On the final day as it was all going through and I was watching edits come back, I wasn’t as excited as I was last year when I was getting Jonathan Glazer or Michel Gondry’s film in. It was a stream of amazing films. [In 2015, for the NDS’ 25th anniversary, 25 previous alumni were asked to create one minute of a 25-minute experimental film.] 

But I screened the whole reel a few days before the screening and invited an EP from MPC to watch it with me. And he couldn’t guess which film was created by the AI. That’s when I got excited. 

The 'director'


LBB> Now that you’ve worked on this, how far can AI go within the ad industry? 

AG> It’ll be part of what we do - just like how we rely on laptops and iPhones. There will be a way to tune it to suit our needs. We’ve not run this experiment [the New Directors’ Showcase one] in response to the fear factor of people losing their jobs to machines. This is more about looking at how AI can support what we do. If we work with it then we can build it and adapt it to our needs. But I think it’s just another technical facility that we will call upon, and how we call upon it is up to us.
