
One Green Bean Highlights AI's Alarming Gender Bias

08/03/2023
An experiment by the Havas agency using the popular platform Midjourney revealed a clear bias toward depicting men in highly paid jobs.

Ahead of International Women’s Day, creative comms agency One Green Bean has revealed how emerging AI generative image tools consistently under-represent women across senior professional roles.

The agency undertook a two-part experiment using Midjourney, an artificial intelligence platform that has surged in popularity in recent months. The program generates images from descriptive text prompts, drawing on an estimated five billion images scraped from the web.

One Green Bean first asked the platform to generate images based on the job titles of three members of the global leadership team – managing director, executive creative director and head of public relations EMEA. The results revealed a clear male gender bias.

To extend the experiment further, the team then ran the top 20 highest-paid jobs in the UK, as ranked by The Times newspaper, through the Midjourney platform. 88% of the images reinforced male gender stereotypes. From chief executive to locum consultant, tax partner to aircraft pilot, artificial intelligence revealed an overwhelming bias towards men.

According to Fawcett's Sex and Power 2022 Index, fewer than a third of the UK's top jobs are held by women, suggesting that AI is reflecting a reality that is already distorted.


Above: The UK's top 20 highest-earning jobs, as depicted by Midjourney.


"There’s been huge hype around AI tools like ChatGPT and Midjourney. We’ve been deep in experimentation to understand its potential, but an eye-opening limitation became clear very quickly", says Kat Thomas, founder and global ECD of One Green Bean. "A distinct gender bias is very evident, with favorability consistently skewing male. That’s not the only bias either. When you do include ‘woman’ in your key words, imagery tends to be sexualized – big boobs, unbuttoned shirts, pouting lips. Another huge bias is around diversity, the images these platforms generate overwhelmingly skew white, as well as male. Our industry is obsessed with artificial intelligence and, whilst embryotic right now, its capacity is revolutionary. However, it’s not without its limitations and its bias against women is a significant hurdle these platforms need to overcome. They effectively hold a mirror up to society, demonstrating that ingrained cultural biases dictate the norms that machine intelligence currently relies on.”

Further desk research using Midjourney revealed gender bias across a huge variety of disciplines. When ‘International Tennis Star’ was typed into the platform, four shouting men appeared, with no sign of Serena Williams or Ash Barty. Other roles the team looked at included air traffic controller, paramedic, finance manager, police officer and train driver, all of which returned images of men.
