Everybody’s emotions fluctuate throughout a 24-hour period. On 8th November 2016, the emotions of a whole country, and possibly the world, fluctuated at certain moments:
06:00 EST – Excitement – Polling stations start to open
17:00 EST – Joy – Hack identified on Trump’s campaign website
19:00 EST – Anticipation – Polling stations start to close
22:25 EST – Uncertainty – First swing state result comes in
22:30 EST – Panic – Canadian Immigration website crashes due to traffic volume
02:00 EST – Confusion – Clinton rally ends abruptly with no winner announced
02:35 EST – Shock/Fear/Delight (delete as appropriate) – Donald Trump elected 45th President of the United States
That’s a lot to go through in one day. Emotions will continue to rise and fall over the coming weeks and months, but this isn’t anything new. 2012 was dubbed by many the first digital election, but in 2016, digital was even more important. People used social to share their opinions on every aspect of the election, including the candidates, who at times used it to ‘speak’ to each other:
But social is only the start. Add in all the news articles, speeches, debates and interviews, and we are looking at a huge volume of data. As an analyst, my mind always asks, “how can I understand what this means, and what can I do with it, now?” Social listening tools help capture a lot of the text content, but can they deliver the analysis at speed? I think back to 2012 and a live dashboard from Brandwatch outlining the key trends through that election. But what about video content? What about the images and what they mean? What are people feeling?
During ITV’s election coverage, they showcased a newly created AI system, EagleAi, to leverage this data throughout the night, drawing on all mentions since the July nominations. Built on IBM’s Watson Analytics, the tool digested text, images and videos, examining language, sentiment, imagery, personality, tone and confidence. Social listening tools continue to get smarter, but right now, no one is even close to matching this.
In most cases, if you tell me about a tool understanding sentiment and language, I’ll give you my spiel about how systems can’t understand what sentiment the word ‘bad’ has in a tweet, let alone the tone that is being used. But when it comes to Watson, I am no longer surprised.
I have seen multiple uses of the system recently that shocked me with how far AI has evolved in such a short period of time. Watson can analyse content from seven websites and produce a personality profile for each business, matching what we already know and believe. So now I’m thinking, ‘how far can this go?’ EagleAi has taken the tools we’ve been used to in the past and put an asterisk against them all in terms of understanding data, and I have little doubt it could detect the same emotions, at the same times, as those I listed above. But I continue to ask: what can I do with it?
Data for me is intelligent when it is used to inform decisions. What intelligence can be taken from this? Whilst it was a great tool to show the value of data on the night (and hopefully tell me my assumptions were correct), the question for me is whether EagleAi had the ability to correctly predict the outcome of the election, something few polls were able to do. But polling is a group of individuals with their own opinions. Social listening is a game where whoever shouts loudest wins. News articles have their own agendas at play as well. With all the talk of a Clinton win before polls closed last night, I doubted that ITV would bring on a tool that would suggest otherwise, but I was wrong. EagleAi correctly predicted the outcome, so where does that leave me?
There is still a need for the human element of data analysis. AI may predict the answer to the question it is given, but the context, and what to do with it, is just as important. Whilst AI continues to improve, it is driving me, and those like me, to think faster and smarter.
Tom White is Head of Data Intelligence at iProspect