AI; emotions as the connectors between us and the AI engines (2/3)

By George Achillias


AI & emotion, 2 of 3

AI aims to be something completely different from anything we have experienced so far. Larry Page, the CEO of Alphabet, the parent company of Google, said some weeks ago that “Artificial intelligence will be the ultimate version of Google. The ultimate search engine that will understand everything on the web. It will understand exactly what you want, and it will give you the right thing. We’re nowhere near doing that now. However, we can get incrementally closer to that, and that is basically what we work on.”

In 2013, a movie named “Her” came out. Falling in love with a computer, or meeting your perfect partner through a keyboard, wasn’t new for sci-fi cinema. But in this movie, we had a tailor-made, hyper-personalised operating system interacting, “feeling” and creating a deep relationship with a human. It was fascinating to watch the lead actor, Joaquin Phoenix, playing Theodore Twombly, a twitchy, tortured soul hiding behind tortoiseshell spectacles and holding down a day job ghost-writing personalised (e-)letters for those unable to put their emotions into words, fall in love with Samantha. But what was more revealing was Samantha’s ability to win Twombly’s trust and emotions.

He fell for the seductive, comforting skills of an electronic office organiser who could read him like a book and offer unquestioning allegiance in the wake of his failed marriage to the independently minded Catherine (Rooney Mara). One minute Twombly was having functional conversations with his OS, aka Samantha, about cleaning up his inbox; but it didn’t take long before he started having the kind of cybersex that has become the preferred method of interaction for isolated online humans and would-be lovers the world over.

But as Twombly grew in confidence and got back on his feet, so did the self-named Samantha, the infinite portals of the virtual world enabling her to become so much more than the sum of her parts.

As he fell for Samantha, Twombly withdrew from the world, an awkward blind date with Olivia Wilde’s character (and a failure to put his arms around Amy Adams’s unhappily married “friend”) signifying how far he had drifted from human interaction. In the film’s most toe-curling sequence, Samantha, the self-named operating system, organised a surrogate sex session whose physical presence felt all kinds of wrong, suggesting that the last thing Theodore wanted was human interaction and touch. Meanwhile, the errors that once crashed Theodore’s marriage began to resurface like a recurrent computer virus.

And so we reach the point where we start talking about the emotions machines can develop, and how this can affect the human way of apprehending and experiencing the world. Currently, machines are able to map and understand emotions, but not to perceive and feel intentions. Researchers know that recognising intent is a hard problem for machines, and they are working on it. What would be of paramount importance is the moment machines start sensing emotions in human dialogue and reacting accordingly, whether by listening to what people say to and about them or by understanding facial expressions in depth.
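
To make the idea of sensing emotion in dialogue and reacting accordingly a little more concrete, here is a minimal, deliberately naive sketch. The keyword lists, emotion labels and canned replies are invented for illustration; a real system would rely on a trained emotion-classification model rather than word matching.

```python
# Toy sketch only: a keyword-based "emotion sensor" for user utterances.
# Word lists and replies are invented for illustration, not a real model.

FRUSTRATION_WORDS = {"useless", "stupid", "hate", "broken", "again", "ugh"}
POSITIVE_WORDS = {"thanks", "great", "love", "perfect", "brilliant"}


def detect_emotion(utterance):
    """Return a coarse emotion label based on simple keyword counts."""
    words = set(utterance.lower().split())
    frustration = len(words & FRUSTRATION_WORDS)
    positivity = len(words & POSITIVE_WORDS)
    if frustration > positivity:
        return "frustrated"
    if positivity > frustration:
        return "pleased"
    return "neutral"


def respond(utterance):
    """Adjust the agent's tone according to the detected emotion."""
    emotion = detect_emotion(utterance)
    if emotion == "frustrated":
        return "Sorry this isn't working as expected. Let me try another way."
    if emotion == "pleased":
        return "Glad that helped. Anything else?"
    return "Okay, working on it."


if __name__ == "__main__":
    print(respond("this is useless, it is broken again"))  # softened reply
    print(respond("thanks, that was perfect"))             # upbeat reply
```

The point of the sketch is the shape of the loop, sense the emotional signal, then adapt the response, rather than the crude detection itself.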


Machines are on the verge of acquiring the ability to recognise and understand human emotion and to respond to it in human-like ways, so that negative feelings are less likely to arise. This, in turn, will let them build additional capacities that can help people develop and assess the abilities and behaviours that contribute to emotional intelligence.

These skills (interestingly, Amazon calls each new Alexa feature a “skill”) are shaping the way software and applications interact with humans. Agents will show an understanding of human feelings and take steps to be less frustrating, while customer-facing applications will show respect for a customer’s emotional state and situation by acknowledging it and adjusting their conduct in response. Somehow we have reached the point where we can say about machines what Girard believed about humans.

Girard’s big idea was the so-called ‘mimetic desire’. Human beings are born with a primitive set of needs, such as food and shelter. As soon as these fundamental necessities of life are acquired, we look around us and observe what others are doing, with the main purpose of copying them. In Thiel’s summary, the idea is ‘that imitation is at the root of all behaviour’. Based on this, we can say that machines are at the level beyond satisfying their basic needs: they are starting to sense, and imitate, human behaviours, habits and emotions.

The significant rate at which people curse at their machines, and at which “colleagues hurl abuse at their machines,” should drop. Experience with information technologies should become less frustrating and more productive. As a consequence, this will help brands start producing hyper-personalised experiences and universes.

The challenges organisations face are driving the type of personalised experience they should offer customers. Implicit personalisation, which tracks a visitor’s behaviour on the site and presents content based on clicking patterns, is the option the majority (65%) of brands choose. Only 37% of brands focus on building up a profile of each customer and then using it to push relevant, appropriate content.
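
As a rough illustration of what implicit personalisation means in practice, the sketch below tracks a visitor’s clicks by content category and surfaces more of whatever they click most. The catalogue, category names and visitor data are invented placeholders, not any particular brand’s setup.

```python
# Toy sketch only: "implicit personalisation" as click-pattern tracking.
# Catalogue, categories and visitor data are invented placeholders.

from collections import Counter

CATALOGUE = {
    "travel": ["City breaks for spring", "Hidden beaches guide"],
    "finance": ["Five budgeting habits", "Understanding index funds"],
    "tech": ["Voice assistants compared", "What is emotional AI?"],
}


class ImplicitProfile:
    """Builds a per-visitor preference profile purely from click behaviour."""

    def __init__(self):
        self.clicks = Counter()

    def record_click(self, category):
        # Implicit signal: no survey, no declared preference, just behaviour.
        self.clicks[category] += 1

    def recommend(self, n=2):
        # With no history yet, fall back to a generic mix of content.
        if not self.clicks:
            return [item for items in CATALOGUE.values() for item in items][:n]
        # Otherwise surface content from the visitor's most-clicked category.
        top_category = self.clicks.most_common(1)[0][0]
        return CATALOGUE[top_category][:n]


if __name__ == "__main__":
    visitor = ImplicitProfile()
    visitor.record_click("tech")
    visitor.record_click("tech")
    visitor.record_click("travel")
    print(visitor.recommend())  # items from "tech", the most-clicked category
```

Building an explicit profile of each customer, the approach only 37% of brands take, would add declared preferences and history on top of this purely behavioural signal.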