
Netflix’s ‘The Great Hack’: The Internet Confirms Whatever You Want to Believe

23/09/2019
INFLUENCER: Proximity London's Sara Parrish on not-so-advanced data tactics

Over the weekend, I watched ‘The Great Hack’ on Netflix. I went in expecting to see a load of advanced data tactics that could only be executed by data scientists at the top of their game. Instead, I was surprised to see strategies that didn’t feel advanced at all, but very familiar. Why? Because they’ve been used, albeit in a less sinister manner, in the ad industry for years now.

The whole ‘is your phone listening?’ phenomenon has become a running back-and-forth with friends, who question whether their mobile is listening to them in order to serve up freakishly relevant adverts. The answer is no (at least not for ad purposes). But the data collected on you is so accurate that your behaviour and what you’re in the market for can be predicted. Arguably, this is more reliable and far easier than the effort of ‘listening to you’. There’s also a big confirmation bias at play here – we see thousands of ads, but the one that catches our eye is just so dead-on that we think our phones must be listening.

‘The Great Hack’ highlighted how whistle-blowers zeroed in on Cambridge Analytica’s similar use of psychographics. They essentially isolated 'persuadables' using Facebook audience targeting, then bombarded them with right-wing propaganda. As a data strategist in the marketing space, this new documentary reminded me of how everyone in my field has a responsibility to draw a hard-and-fast line between the ethical and unethical use of targeting. If it ever feels a bit wrong, then it probably is. And we should challenge our clients and our strategies accordingly. 

Essentially, the manipulation of data and targeting in US politics can serve to teach the advertising industry some valuable lessons.

 

Have We Learned Anything?

With the next US election fast approaching, I do feel that we're at least more aware that we've been played, and should know what to watch out for with regard to propaganda. However, my fear is that 'persuadables' can still be found as an audience and manipulated again for political gain. More disturbing evidence from the documentary showed that information warfare, where propaganda is used as a 'weapons-grade communications tactic', is seemingly driven by a mostly far-right agenda. The stat shown at the end of the documentary revealed that Donald Trump's campaign ran 5.9M visual ads, in contrast to Hillary Clinton's 66K.

Really, my main concern is that we haven’t learned enough from 2016. Now that we know about online propaganda that aims to polarise us, will our ‘persuadable’ counterparts be manipulated again in the upcoming election? Or will we all do a better job this time around and avoid being psychologically tricked by content that was designed to evoke a reaction and bag a vote?

It’s important to remember that anyone can see the live ads running from any Facebook page: click through to Page Transparency, then to Ad Library. For example, at the time of writing, Trump is running more than 50,000 paid ads in the US, and one of them is targeting women to enlist for 'Women for Trump'. You can see the budget spent, a full view of the ad, where he's targeting (clearly conservative/swing states) and how much he's spending.
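(If you'd rather pull these numbers than click through page by page, Facebook also offers an Ad Library API. Below is a minimal sketch of what a query might look like in Python; the endpoint version, field names and placeholder access token are my assumptions based on the public documentation, so treat it as illustrative rather than a definitive recipe.)

```python
# Minimal sketch: querying the Facebook Ad Library API for political ads.
# Assumes you have registered as a developer and hold a valid access token;
# endpoint version and field names follow the public docs and may change.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder: generate one via Facebook's developer tools

params = {
    "search_terms": "Trump",                  # or use search_page_ids for a specific page
    "ad_type": "POLITICAL_AND_ISSUE_ADS",
    "ad_reached_countries": "US",
    "fields": "page_name,ad_creative_body,spend,impressions,region_distribution",
    "access_token": ACCESS_TOKEN,
}

response = requests.get("https://graph.facebook.com/v4.0/ads_archive", params=params)
response.raise_for_status()

for ad in response.json().get("data", []):
    # spend and impressions are returned as ranges (lower_bound / upper_bound)
    print(ad.get("page_name"), ad.get("spend"), ad.get("region_distribution"))
```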


I also checked out the Democratic candidate pages for Bernie Sanders and Kamala Harris, who each stand at around $2.7M in ad spend over the past year. If you’re curious, go and look at your own preferred candidate, see what type of ads they're running and who they're targeting in the run-up to the election, and ask whether it sits right with you.

Whatever happens, we know that fake news and propaganda will still be a problem for the 2020 election, but we also have more data on who is spreading misinformation. Several recent studies have shown that the older, more conservative population is far more likely to buy into fake news, believe in conspiracy theories and be convinced by propaganda. An article from The Verge a while back looked into this and found that those over 65 were the most likely to share fake news, and that 18% of Republicans had shared a fake news story, compared to 4% of Democrats.

Moving Forward, ‘I Didn’t Know Better’ Won’t Be an Excuse

Much of this comes down to what The Verge describes as a 'digital literacy gap'. And there's a lot of merit to this, as it's our darling older generation who make up the highest percentage of victims falling for scams and getting swindled out of their retirement funds by sinister email and telemarketing tactics. The job to be done now is to keep educating people, yourself included, to always vet sources.

Do your due diligence and click through to the Facebook page behind that seemingly extremist piece of content to check:

· How long has the page been active?

· Are they running paid ads?

· Google it. Has it been embroiled in any controversy?

· Does it feel too emotionally charged rather than fact-based?

· Are the facts backed up by scientific evidence and linked through to any academic papers or reputable publishers?

 

Opinions and the Truth Are Not One and the Same

I know the majority of us left school some time ago, but rewind and remember how rigorously we had to vet our sources in order to use stats or figures in our own work. I think we've lost that skill in our adult lives; perhaps out of laziness, we just want to believe what's easiest. Remember, you are entitled to your own opinion, but you are not entitled to your own facts. However hard you believe in something, that doesn't make it true. I always like to point people in the direction of checking their sources through platforms like the Media Bias Chart.

Its creators do a swell job of keeping it updated, so you can pick out the most unbiased, neutral sources for news and information. My personal favourites for great journalism are Reuters and The Economist. I know I can go to either of these and get good, factual journalism without stressing over excluded information and/or skewed POVs.

Another fantastic resource for running a questionable source through is Media Bias/Fact Check (https://mediabiasfactcheck.com/search/), which will tell you if a source is extreme-leaning and/or has published content that has been proven false or without merit. For instance, the famously divisive Breitbart is labelled far-right on the Media Bias Chart and flagged as an ‘extreme’ and ‘questionable source’ by Media Bias/Fact Check.


Misinformation Only Works If It’s Believed

It’s vital to remember: when the internet will confirm anything you want to believe, the onus is on us to dig out the truth.

With all this in mind, those of us working with data should draw our line in the sand and genuinely work to keep pushing our clients towards a more transparent and ethical use of data. My job is to be the voice of the consumer, and as a consumer, I don’t want to be tricked, manipulated or toyed with for anyone’s gain. My rules of engagement are as follows:

1. Operate in complete transparency. If your entire targeting strategy became public, would it cause an uproar? If so, don’t use it. 

2. Ask yourself: would you appreciate it if this campaign were targeted at you? If not… it’s best avoided.

3. Currently, privacy is a privilege in the US and a right in the EU. In light of that, self-regulation is imperative. Without clarity on what the global boundaries should be for leveraging personal data, can you say with a high degree of confidence that all the data you’re using was obtained with consent? And if you’re using methods of ‘implied consent’… don’t. 



Sara Parrish is senior data strategist at Proximity London
