With the Internet of Things, we are now connected to products in ways like never before. The fact that our coffee machines can link to our smart watches, which in turn are linked to our sleep-cycle apps, makes the products we own intuitively sync up to our every move. As Sophie Kleber, the executive director of product and innovation at Huge, insightfully observed, we have reached the end of the “terminal world.” We simply don’t need screens to interface with technology anymore. Instead, machines have learned to communicate with us -- as we’ve seen with Apple’s Siri and Amazon’s Alexa. Voice as an interface is becoming mainstream. And as more intuitive interfaces emerge, these new technologies will naturally integrate into the rhythm of our lives.
Today we stand at the dawn of seamless and ubiquitous computing. This development is all the more important because of the advent of affective computing, where devices are starting to recognise, interpret, process and even simulate human behaviour. By scanning our voice patterns and facial expressions, this new technology can analyse language inputs, looking for emotional undercurrents and micro-expressions to decode our true thoughts and intentions from biometric feedback.
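To make the language-analysis side of this concrete, here is a deliberately simplified sketch. Real affective systems use trained models over voice and facial data; this toy version just matches words against a small, entirely hypothetical emotion lexicon (the `EMOTION_LEXICON` table and `detect_emotions` function are illustrative inventions, not any vendor's API):

```python
# Toy illustration of scanning language input for emotional
# undercurrents. The lexicon below is hypothetical; production
# affective-computing systems rely on trained models over audio
# and video, not word lists.

EMOTION_LEXICON = {
    "frustrated": "frustration",
    "alone": "loneliness",
    "lonely": "loneliness",
    "stubborn": "stubbornness",
    "thrilled": "joy",
    "delighted": "joy",
}

def detect_emotions(text: str) -> dict:
    """Count cue words for each emotion named in the toy lexicon."""
    counts: dict = {}
    for word in text.lower().split():
        word = word.strip(".,!?;:\"'")  # drop surrounding punctuation
        emotion = EMOTION_LEXICON.get(word)
        if emotion:
            counts[emotion] = counts.get(emotion, 0) + 1
    return counts

print(detect_emotions("I felt so alone and frustrated, honestly lonely."))
# → {'loneliness': 2, 'frustration': 1}
```

Even this crude approach shows the basic shape of the idea: language carries emotional signal that software can surface, and the real systems described below layer far richer inputs (tone, micro-expressions, biometrics) on top of it.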
There are already a few companies pushing the boundaries in this space. Beyond Verbal is an emotional analytics company that has developed software that can decode human vocal intonations and underlying emotions in real time. They ran their software on a video of Steve Jobs talking about the launch of the iPad and it picked up on underlying feelings of loneliness, emotional frustration, stubbornness, and childish egoism. These are deeper emotions than most human beings can normally perceive at the surface level, but just think about the implications for decoding client feedback.
Soul Machines is another company that develops intelligent, emotionally responsive, human-like avatars. Their latest project, “Nadia,” is a virtual assistant voiced by actress Cate Blanchett and designed to help empower people with disabilities. Through your computer’s web camera and microphone, Nadia can see and hear you, responding to changes in your voice and interacting as intuitively as a human would -- ultimately giving the user a more emotional connection.
President and Founder of the Ogilvy Center for Behavioural Science Christopher Graves has stated that the brain responds more empathetically to emotional narratives. Emotional narratives let you connect with an individual in a way that allows you to see yourself in them; these fully immersive experiences are more persuasive than cold facts alone. In fact, Graves cites the claim that 90-95% of the decisions we make are based on emotion, not logic.
As emotionally intelligent machines evolve, brands will have a powerful opportunity to develop personalities that forge deeper relationships with consumers through emotional connection.
Designers have always created emotional objects -- from smiley-faced cars to loveable cuddly toys -- all meant to trigger some emotion inside of us. But as more brands want to play in this space and create emotionally intelligent machines, designers will inevitably need to craft psychologies and personalities for brands. Emotional psychology will become a mandatory field of study in design, and intelligent machines will evolve to play roles in our lives that we never thought possible.
Mindy Benner is Senior Designer at 72andSunny LA