"Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.” - "Ian Malcolm," in Jurassic Park.
So much is being written these days about the power and potential of artificial intelligence (AI) tools and their ability to redefine our world, from automating simple, repetitive text and drafting academic essays to changing how some people in our industry, like copywriters, do their jobs. These AI tools are being explored everywhere.
But in the midst of all this, few people are stopping to think about the ethics of AI. If I prompt an AI tool to write a document in the style of William Shakespeare’s sonnets, in iambic pentameter, am I plagiarising from Shakespeare? If I ask an AI tool to take a picture and render it in the style of Jackson Pollock, am I stealing from his estate? If, in rendering a piece of artwork, the AI tool uses an Ansel Adams photograph, do I need to license that photograph?
The answers to these questions aren’t clear. Indeed, some writings and artworks of famous people are in the public domain and exempt from such ethical (or legal) questions. But countless works of literature, art, and commercial content still have copyright protection, and many of their originators and artists are still alive and hold ownership over that work. And yet AI tools today are taking that content and using it to generate new, derivative content. In these cases, the ethics inherent in the new work are not well defined. Take the example of the ‘new’ song ‘Heart On My Sleeve’, featuring AI-generated vocals imitating Drake and The Weeknd, which was posted on TikTok and became a hit. Who owns, or should own, that piece of music (and its royalties)? Should the song even exist?
Some AI companies are trying to find pathways through these murky ethical waters. Adobe is introducing an AI product called ‘Firefly’ that trains its machine learning models only on material free of copyright issues. Adobe claims that we can use this product to make AI-generated material free of ethical dilemmas (and of the legal issues that may arise). This may sound wonderful, but many perceive the images Firefly produces as less interesting or creative than those produced by Midjourney or DALL-E 2, other AI products which draw on additional, copyrighted material to generate their results. The conundrum, of course, is that acquiring the legal rights to all that art is impossible today, especially given the way we’ve structured access to those images or styles.
This discussion is very similar to what happened in the market for recorded music years ago. ‘Bootleg’ concert tapes and illegally copied recordings were followed by Napster and other platforms that enabled the often illegal downloading of .mp3 and other sound files. These platforms and the technology behind them fundamentally changed the way music was created, produced, sold, and listened to. In the end, we landed in a place where companies like Spotify, Apple Music, and Pandora can legally sell their content and give music audiences what they want, rather than remaining tied to an old, now outdated, business model.
Just as the music industry adapted to a fundamental change in the way people both create and listen to music, the AI tools of today may be ushering in a new reality for the world of art, literature, and even commercial content. Just as Napster and its progeny challenged, then redefined, intellectual property rights related to music, writers and artists are entering an era in which they may be able to license not only their works but also their style or mechanics of creation. Note progressions, paragraph structures, brushstroke styles, and colour palettes may soon become as legally defined and protected as the final works they comprise. Ad agencies may find ways to define and protect their creative processes and campaigns, keeping them unique for the benefit of their current and prospective clients. In a world facing the uncertainty of AI, will artists, writers, and other creatives start using blockchain to track their artistry and contribution? Will Hyperledger or other distributed ledger technologies help track who owns what, so that licensing and royalties can be correctly attributed?
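To make that last idea a little more concrete, here is a minimal, purely hypothetical sketch in Python of the kind of record such a ledger might hold: each entry names a work, the contributors and their royalty shares, and the styles it draws on, and is chained to the previous entry by a hash so the attribution history is tamper-evident. The class, field, and contributor names are illustrative assumptions, not any existing platform’s API.

```python
# Hypothetical sketch of a hash-chained attribution ledger entry.
# All names and fields are illustrative assumptions, not a real platform's API.
import hashlib
import json
from dataclasses import dataclass, field


@dataclass
class AttributionEntry:
    work_title: str      # e.g. a song, image, or campaign asset
    contributors: dict   # contributor name -> royalty share (fractions summing to 1.0)
    source_styles: list  # styles or prior works the piece draws on
    prev_hash: str       # hash of the previous ledger entry (tamper evidence)
    entry_hash: str = field(init=False)

    def __post_init__(self):
        payload = json.dumps(
            {
                "work_title": self.work_title,
                "contributors": self.contributors,
                "source_styles": self.source_styles,
                "prev_hash": self.prev_hash,
            },
            sort_keys=True,
        )
        self.entry_hash = hashlib.sha256(payload.encode()).hexdigest()


def split_royalty(entry: AttributionEntry, payment: float) -> dict:
    """Divide a payment across contributors according to their recorded shares."""
    return {name: round(payment * share, 2) for name, share in entry.contributors.items()}


# Usage: record a hypothetical AI-assisted track and split a 100.00 royalty payment.
genesis = "0" * 64
entry = AttributionEntry(
    work_title="Example AI-assisted track",
    contributors={"original_artist": 0.6, "prompt_author": 0.3, "tool_vendor": 0.1},
    source_styles=["artist_vocal_style"],
    prev_hash=genesis,
)
print(entry.entry_hash[:12], split_royalty(entry, 100.0))
```

Whether anything like this ever underpins real licensing is an open question, but it shows the shape of the problem: attribution and royalty splits only work if provenance is recorded somewhere that all parties trust.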
At a minimum, it is critically important for our industry to recognise and address the ethics of these AI tools before we start using them commercially. This will not only offer legal protections for our agencies but will benefit our clients as well. We can all then sleep peacefully knowing that artists and writers are not being disadvantaged, our creative content is unique and protected, and we do not have to hide from possible claims of plagiarism or outright theft.
Veronica Millan is global chief information officer at MullenLowe Group. She writes a regular column about the metaverse for Little Black Book - check out the rest of the series here.