EE - Robot Barber
09/09/2021
Post Production
London, UK
When Saatchi & Saatchi and EE devised the world’s most challenging high-performance 5G network test, the remote-controlled robotic shave of a popular actor, there was only one partner who could bring such a unique combination of precision technologies to life with the style it deserved.

For this world-first, The Mill created a bespoke real-time tracking and robotics system: optical tracking recorded the movement of a London barber’s hand and translated it, live, to a customised robotic arm 250 miles away, which delivered a wet shave to Lucifer star Tom Ellis with millimetre accuracy. On a mountaintop. Using only the public network. No excuses.

The EE network carried both the real-time tracking data and a crucial video call guiding the barber’s movements over a long distance while the shave was filmed live. Only the tightrope combination of EE’s powerful network, real-time tracking and precise robotic control could make this cutting-edge demonstration possible. A testament to all three production partners, as EE rightfully celebrated:
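The article does not disclose The Mill's actual implementation, but the pattern it describes (streaming hand-pose samples over a public network and smoothing them on the robot side so the arm keeps its human feel) can be sketched in a few lines. Everything below is a hypothetical illustration: the pose format, the UDP transport, and the smoothing factor are assumptions, not details from the production.

```python
# Hypothetical sketch, NOT The Mill's system: stream hand-pose samples
# over UDP and exponentially smooth them on the receiving ("robot") end,
# the kind of filtering that preserves subtle human motion while taming
# network jitter.
import json
import socket


def make_pose(x, y, z):
    """A minimal 3-DoF position sample; a real rig would add rotation."""
    return {"x": x, "y": y, "z": z}


def smooth(prev, new, alpha=0.3):
    """Exponential smoothing: blend the new sample into the previous target."""
    return {k: (1 - alpha) * prev[k] + alpha * new[k] for k in prev}


def main():
    # Receiver stands in for the robot controller; sender for the tracker.
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("127.0.0.1", 0))  # OS picks a free port
    port = recv.getsockname()[1]
    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    target = make_pose(0.0, 0.0, 0.0)
    for i in range(5):  # pretend tracker frames
        sample = make_pose(0.1 * i, 0.0, 0.05 * i)
        send.sendto(json.dumps(sample).encode(), ("127.0.0.1", port))
        data, _ = recv.recvfrom(1024)
        target = smooth(target, json.loads(data))
        print(f"frame {i}: robot target x={target['x']:.3f}")

    send.close()
    recv.close()


if __name__ == "__main__":
    main()
```

The smoothing step is the interesting trade-off: too much filtering and the arm looks robotic, too little and network jitter reaches the blade.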

“As the world begins to open up again, we want customers to feel inspired about what they can do when armed with the EE network. We shoot our campaigns live over our public network, to show what’s possible, no matter how extreme. No smoke and mirrors, this really happened. Go more places and do more of what you love, knowing that, like Tom, you can count on our high-performing network in the moments that matter.”
Kelly Engstrom, Brand and Demand Generation Communications Director at EE
The custom software and hardware configuration was developed by The Mill’s Creative Technology team, led by Technical Director Noel Drew. Amongst the myriad challenges and innovations, Noel picked out making the system look real, as well as actually being real, as the issue that worried him most:
“Perfecting the real-time movement of the arm was unbelievably tricky. With the shoot happening in such remote locations, it was essential the movement was right. It had to be believable, and that came from it moving like a human. It’s a matter of detail. Very subtle but unmistakably human nuances. It’s actually very easy to tell the difference between a computer-controlled robot and one being mapped in real time to a human. At one point I could see the arm moving up and down gently and panicked that the data was drifting, until I realised it was the person in control breathing while holding the blade still. It was amazing, and that level of accuracy really paid off.”