In 2021, 85 million suspected pieces of child sexual abuse material (CSAM) were found online. As of 2022, the number was growing at a rate of one piece of CSAM being uploaded every two seconds.
In light of this information, the Canadian Centre for Child Protection contacted No Fixed Address Inc. with the purpose of creating a campaign to draw attention to these shocking, and extremely upsetting, statistics. For the project’s ACDs, Andrew Rizzi and Daniela Angelucci (who have since moved on to freelancing), the realisation that this problem was not only bigger than all the world’s film festivals combined, but was literally a ‘film festival’ hiding in plain sight, proved a jumping-off point.
To bring ‘The Unwanted Film Festival’ to life, an AI algorithm was created to produce one film poster every two seconds - representing the speed and scale of the issue. Film titles were all inspired by real survivor stories. A website showcased these posters and provided additional information on the subject. The site also shared petitions calling on tech companies to block CSAM from their platforms.
LBB’s Josh Neufeldt sat down with Andrew and Daniela to learn more about how this campaign was brought to life.
LBB> The growth and spread of child sexual abuse material online is both an urgent and sensitive issue. What was the brief like?
Andrew> We knew we had to come up with something unconventional that would get people’s attention. This problem is simply ignored by the vast majority of the public, because it’s so uncomfortable to talk and think about, and because people feel like they can’t do anything about it.
Daniela> The original brief focused on the skyrocketing growth rate of child sexual abuse material (CSAM) online year over year, and how it affects survivors well beyond their abuse. The damage stays with them for years because this material doesn’t get taken off the internet - it continues to recirculate online causing survivors to be re-victimised over and over again.
LBB> What was your research process like for this campaign? What key data points and takeaways came from it?
Daniela> Our team at NFA worked closely with C3P on this. We found that there was a 30% increase in CSAM content from 2020 to 2021 – translating to 85 million pieces reported in 2021, with one being uploaded every two seconds. It’s a haunting statistic and we knew that it was so big that it wouldn’t be humanly possible to show this amount in any way, which led us to the idea of using AI to bring it to life.
LBB> What is your relationship with the Canadian Centre for Child Protection like? How did the idea for ‘The Unwanted Film Festival’ come to pass?
Andrew> NFA and the Canadian Centre for Child Protection have a strong, long-standing relationship, having worked on several campaigns together over the years. Daniela and I have had the opportunity to work with them on several projects, including a 2017 campaign to help prevent the sextortion of teenage boys called ‘Don’t Get Sextorted, Send a Naked Mole Rat’, and then this year with ‘The Unwanted Film Festival’.
What’s great about our relationship with the incredible team at the Canadian Centre for Child Protection is that there is a ton of trust on both sides. They welcome ideas that push the envelope, and we trust them when they tell us that something might go too far.
Daniela> After getting the brief, we went back and did what we always do, which is digging deep into the problem and trying to learn as much as we can about it. We then spent time ideating before presenting a range of ideas. One of those ideas was ‘The Unwanted Film Festival’. Essentially, we thought about how massive this issue is - how many pieces of material are flagged online every year - and asked ourselves, ‘how can we quantify that in terms that would be understood globally?’, because this is a global issue. We used the stats we had found and came to the idea that all of this content being uploaded online is larger than all of the world’s film festivals, combined. It’s the world’s largest film festival hiding in plain sight. That lens can be understood across the world.
LBB> Building on this, the name ‘The Unwanted Film Festival’ is very memorable - a dark but accurate way to represent the situation. Was there any debate, or was this the first and immediate name choice?
Andrew> We had a list of names in the running. We knew we wanted something that would stop people in their tracks and make them think, ‘that sounds like a film festival but something sounds off, why?’. It had to have that connection to a film festival while also relating directly to the problem of CSAM. There was a film piece released by NFA and C3P earlier in the year called ‘Unwanted Followers’ that also dealt with this subject matter. I think that connection also made sense for us.
LBB> A big aspect of this campaign is the website, which uses an AI algorithm to create one film poster every two seconds. Tell us more about this! Who was involved in making this happen and how many posters have been generated at this point?
Daniela> First, in terms of creating the posters, we started off by reading survivor stories and articles on CSAM, and listening to podcasts, to gain as much knowledge about the problem as we could. We then took those real survivor stories and turned them into film posters. We created 50 titles and 50 taglines in six languages, 90 images, and 66 different fonts. We worked with the incredibly talented design, digital and dev teams at NFA who helped bring the idea to life. We then leveraged AI to build an algorithm that created one film poster every two seconds. These assets combined to output 85 million unique posters inspired by real survivor stories. Since launch, almost five million posters have been generated on unwantedfilmfest.com.
LBB> Another integral part of ‘The Unwanted Film Festival’ was the information spot that came with the website’s release. Who was involved in making it, and how did you go about bringing it to life?
Daniela> We wanted to create something that felt like a trailer for the idea - something to explain what this is and why we’re doing it. We worked with Fort York VFX and School Editing to bring it to life, and both did such a wonderful job crafting every detail.
One of the main pieces of the idea and the launch video was ‘The Unwanted Film Festival’ logo. We took a look at film festival logos and common elements across the festival landscape and recognised that the wreath is a main staple. So, we thought, instead of wreaths, what if it was made out of children? After we designed it, Fort York created this stunning 3D animated version and the video was edited in a way that allowed the viewer to see the detail of the children before cutting out to reveal the logo as a whole.
LBB> The music in the ad is super grim - which feels appropriate for the subject matter. Who did the sound and music for the spot?
Andrew> Berkeley did the music for this project and really overdelivered. We gave them the brief - that we wanted something subtly eerie and haunting - and they came back with some fantastic options. The one we landed on was everyone’s favourite. The little details they brought to the table were so smart. Towards the end of the video you can hear this crackling, buzzing sound which is super eerie. That sound is actually a hard drive, which is the main device used by people to upload and exchange CSAM online. It’s a small detail, but wildly smart and powerful.
LBB> What has the response to the campaign been like?
Daniela> A good gauge for the response was our launch in New York City, during the Tribeca Film Festival (we timed our launch to coincide with Tribeca to give context to a much darker film festival that’s happening all around us, every day). We worked with Street Attack to create an immersive experience at Lume Studios, which put the scale of the problem into perspective, and put people right in the middle of it. When visitors entered the experience they became surrounded by AI-generated film posters through 3D projection mapping. Every two seconds, a new poster was built and rapidly filled the walls and the floor of the space - representing the speed at which CSAM is found online.
Andrew> We spoke to hundreds of people who stopped by. Some knew nothing about the issue before seeing this; others knew it was a problem but didn’t realise it was this big. We even spoke to survivors who shared how appreciative they were that we were doing something powerful to spread awareness of the issue. At the end of the experience, we gave everyone a movie-ticket-style business card with a QR code on it that, when scanned, led them to our website, where they could sign the petition to demand that tech companies stop the upload of known CSAM online.
LBB> What challenges have you faced during this initiative? How did you overcome them?
Andrew> It’s such a sensitive issue that we had to be cautious at every touchpoint of the campaign. Our team worked collaboratively to make sure everything was vetted properly so that we didn’t trigger anyone, while also making sure our message was effective.
Daniela> It’s such a dark issue which no one wants to talk about, so finding a way to get people to pay attention and do something was challenging, but completely necessary.
LBB> Is there anything you’ve learned or taken away from the experience of working on a campaign like this? Please share!
Andrew & Daniela> Mainly how massive this problem is. Almost everyone knows that CSAM is online, but learning how rapidly it’s growing, and that it’s not just found on the dark web but all over the open web, was eye-opening.
LBB> While you guys did this campaign at No Fixed Address Inc., you’ve since gone freelance. With that said, is there the possibility of you working with the Canadian Centre for Child Protection on future projects?
Andrew & Daniela> We have always enjoyed working with the inspiring and relentless team at C3P, so who knows what’s possible in the future? But, we also know that the relationship between them and NFA is strong, so we’re sure there will be more powerful campaigns to come from that partnership in the future.
LBB> Is there anything you’d like to add?
Andrew & Daniela> If you’re reading this, please check out the website, sign the petition and share the video. More awareness and more signatures mean more pressure will be put on tech companies to prevent the upload of CSAM on their platforms. While we’ve sparked crucial conversation, the fight to shut down the ‘unwanted film festival’ goes on.