ART-ificial Intelligence: Leveraging the Creative Power of Machine Learning

30/09/2022
Post Production & Animation Studio
London, UK
Steelworks’ executive producer, Chago Venable, on why AI-generated art should be seen as a helpful tool rather than a threat to human creativity

Above: Chago's AI self-portrait, generated in Midjourney. 


I have learnt to embrace and explore the creative possibilities of computer-generated imagery. It all started with the introduction of Photoshop thirty years ago, and more recently, I became interested in the AI software program, Midjourney, a wonderful tool that allows creatives to explore ideas more efficiently than ever before. The best description for Midjourney that I’ve found is “an AI-driven tool for the exploration of creative ideas.” 

If I were talking to somebody unfamiliar with AI-generated art, I would show them some examples, as this feels like a great place to start. Midjourney is syntax-driven; to take full advantage of the program, users must break down the language and learn the key phrases and the specific order of the words. As well as using syntax, users can upload reference imagery to help bring their idea to life. An art director could upload a photo of Mars and use it as a reference to create new imagery – I think this is a fantastic tool.
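For example, a reference-driven prompt is typed into Midjourney's Discord bot through the /imagine command, with the uploaded image linked at the front of the prompt and any parameters at the end. The line below is purely illustrative – the URL is a placeholder and the wording and parameter values are only a sketch, not a recipe:

/imagine prompt: https://example.com/mars-reference.jpg futuristic aeroplane hangar on Mars, moonlit night, photorealistic --ar 16:9 --iw 1

Here, --ar sets the aspect ratio and --iw controls how much weight the reference image carries against the text.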

Once you understand the fundamentals of Midjourney, it is a painstaking case of trial and error, trying out different things until you get what you’re looking for. I love when Midjourney artists publish their work, sharing the syntax they used to develop the idea. That being said, if you were to try to replicate an idea by entering the same exact syntax into Midjourney, you would end up with entirely different results. This is because the software learns from interacting with you. Artificial intelligence of this kind becomes more intuitive the more we use it, and the same goes for us – we start to learn how the software thinks, which is probably the most interesting thing about it. It is a learning machine that learns from you, and in turn, you learn from it. 


Embracing AI-generated artistry 


I’m a producer with an extensive background as a production artist, mostly in retouching and leading post production teams. I also have a background in CGI – I took some postgraduate classes at NYU for a couple of semesters – and I went to college for architecture, so I can draw a little bit, but I'm not going to pretend that I could ever do a CGI project. A lot of art directors and creative directors are in the same boat: they direct and creative-direct a lot of CGI projects – especially on the client side – but don’t necessarily know CGI. Programs like Midjourney let people like us dip our toes into the creative waters by giving us access to an inventive and artistic toolset.

Traditionally, art directors would draw or sketch ideas by hand to communicate their vision, but even then those sketches tend to be rough outlines that cannot possibly match the standards of a commercial artist. If an art director were working on a creative brief and wondered, “What does a futuristic aeroplane hangar look like on a moonlit night on Mars?”, they might be able to envision it, but there would be no tangible sketch or visualisation of the idea. Midjourney allows art directors, and other professionals in the pipeline who don't have extensive CGI backgrounds, to create visuals that help illustrate their idea. These can then be passed on to skilled production artists who will bring them to life. A lot of skilled artists can take a Midjourney image and manipulate it further in Photoshop or post production to give art directors exactly what they're looking for. Working with the machine, we can develop our ideas together, which I think is really, really fascinating. Using Midjourney and other AI-driven programs can help us communicate ideas and flesh out different options, moving the creative process along more efficiently.


Above: the different stages of development for Chago's AI portrait. From left to right: reference image (selfie), raw Midjourney output (after several different rounds of machine 'learning'), Liquify (done in Photoshop for a wider overall face), Photoshop Neural Filters (to adjust proportions for lips, eyes, nose, forehead), final colour grade. Chago went through 34 different rounds of syntax with Midjourney before landing on this one: ::photo realistic::portrait::black man::bald::wearing glasses::beard::in the style of ernie barnes -- iw50.


Making the algorithm work for you


Last week, the Steelworks team was putting together a treatment deck for a possible new project. We had some great ideas to send to the client, but sourcing certain specific references felt like finding a needle in a haystack. If we're looking for, say, a black rose with gold dust powder on the petals, it's hard to find exactly what we want. It’s times like these when a program like Midjourney can boost the creative. By entering similar references into the software and developing a syntax that is as close as possible to what you’re looking for, you are given imagery that provides more relevant references for a treatment deck. For this reason, I see us utilising Midjourney more often for these tasks in future, as it can facilitate creative ideation for treatments and briefs for clients.

In my opinion, we are quite a way away from relying solely on software like Midjourney to complete an entire project. The algorithm isn’t quite that advanced just yet and we can’t get too specific with it. However, in years to come, I would love to see some kind of handshake between Midjourney and CGI software like Maya, so that creatives could transfer what was created in Midjourney into a 3D space. That’s the dream.


Exploring creative capabilities 


I'm optimistic about Midjourney because, as technology evolves, humans in the creative industries continue to find ways to stay relevant. I was working as a retoucher during the time Photoshop first came out with the ‘Healing Brush.’ Prior to that, all retouching was done manually by manipulating and blending pixels. All of a sudden, the introduction of the Healing Brush meant that with one swipe, three hours of work was removed. I remember we were sitting in our post production studio when someone showed it to us and we thought, “Oh my God, we're gonna be out of a job.” Twenty years later, retouching still has relevance, as do the creatives who are valued for their unique skill sets. 

I don't do much retouching anymore, but I was on a photo shoot recently and had to get my hands in the sauce and put comps together for people. Plenty of new selection tools have come out in Photoshop over the last three years, and I had no idea about most of them. I discovered that using these tools cut out roughly an hour's worth of work, which was great. It opened up time for me to talk to clients and be more present at work and at home. At the end of the day, it's less time in front of the computer.

While these advancements in technology may seem daunting at first, I try not to think of them as a threat to human creativity, but rather as tools that grant us more time to immerse ourselves in the activities that boost our creative thinking. Using AI programs like Midjourney helps to speed up the creative process which, in turn, frees up more time to do things like sit outside and enjoy our lunch in the sun, or go to the beach or the park with our kids – things that feed our frontal cortex and inspire us creatively. It took me a long time to get comfortable taking my nose off the grindstone and relearning how to be creatively inspired.
