AI (artificial intelligence) isn’t just science fiction anymore.
These days, people are interacting more and more with AI-backed chatbots and virtual assistants to check schedules, read the news, and control music and lights in their homes. In fact, according to research done by Fjord Australia and New Zealand, Accenture Interactive’s design and innovation branch, 52 percent of global consumers interact with AI-powered live chats or mobile apps monthly, and 62 percent claim that they’re comfortable using intelligent virtual assistants.
The first phase of AI already leverages machine learning and data analytics to provide a more personalized user experience by giving these automated ‘bot’ technologies access to customer profiles, purchasing histories, product preferences, and so on. As companies develop the next generation of virtual assistants, the holy grail is to teach them to be able to recognize and react to human emotions. The end goal is to enable these bots to continually learn and adapt, ultimately making the machine-to-human interaction as natural and fruitful as possible.
The 411 on AI and emotions today
In the early days of Apple’s Siri, released way back in 2011, it wasn’t uncommon to hear people yelling at their iPhones when Siri didn’t understand a question or gave an irrelevant answer. And though the technology has improved vastly over the years, Siri still isn’t able to recognize if a user is angry or happy with her.
When Microsoft introduced Tay.ai in 2016, the chatbot was designed to learn and then emulate human emotions based on public data on the Internet, along with editorial input from staff. However, less than a day after its release, Tay began spewing a stream of racist and inflammatory ‘opinions’ on Twitter based on what it had learned, and Microsoft was forced to pull the plug.
Despite the unfortunate and highly publicized setback, Microsoft and many others are still hard at work baking emotional intelligence into their respective AI tools.
Facebook, for instance, has obtained three emotion-based AI patents in the last three years. One will be able to gauge a user’s emotional state based on how they interact with a keyboard, mouse, touchscreen, or other input device—e.g., typing speed, how hard keys are pressed, the nature of gestures on a touchscreen, location, and device movement. Another can recognize and respond to a user’s reaction to content by reading facial expressions through smartphone or laptop cameras, which will help Facebook deliver more appropriate responses and ultimately optimize user engagement with the app. The third patent will use AI to match the user’s emotions to a corresponding emoji.
How emotionally-intelligent AI fits into retail
One of the most frustrating things about current AI is that it doesn’t know when you’re frustrated—or angry, or even happy or excited. Since the majority of retail applications of AI are customer service-oriented, this is a real problem. This limitation may be slowing adoption of the technology, or even increasing customer contacts in the near term. Until the tech has advanced to better recognize facial expressions, tone of voice, or typing force, what can you do to make sure AI is working for you and delighting your customers?
The immediate solution is to train the bot to recognize certain language patterns or context so it can respond more appropriately, or escalate to a human at the right time. IBM’s Watson, for example, can understand enough syntax to pick up on sarcasm in some cases. Keeping the scope of your bot narrow will also limit the number of issues it won’t know how to handle. Implementing a simple blacklist/whitelist approach—preferably informed by analysis of real-world customer communications—can also cover a good chunk of the use cases your bot needs to handle appropriately.
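To make the blacklist/whitelist idea concrete, here is a minimal sketch of keyword-based message routing. The pattern lists and the `route_message` function are hypothetical illustrations—in practice, as noted above, the lists would come from analyzing real customer transcripts:

```python
import re

# Hypothetical pattern lists for illustration only; real deployments would
# derive these from analysis of actual customer communications.
ESCALATE_PATTERNS = [
    r"\bspeak to (a|an) (human|agent|person)\b",
    r"\brefund\b",
    r"\bdamaged\b",
    r"\b(terrible|awful|furious|angry)\b",
]

HANDLE_PATTERNS = [
    r"\border status\b",
    r"\bstore hours\b",
    r"\breturn policy\b",
]

def route_message(message: str) -> str:
    """Return 'escalate', 'bot', or 'clarify' for an incoming message."""
    text = message.lower()
    # Blacklist first: frustrated or high-stakes messages go to a human.
    if any(re.search(p, text) for p in ESCALATE_PATTERNS):
        return "escalate"
    # Whitelist: narrow, well-understood topics the bot can safely answer.
    if any(re.search(p, text) for p in HANDLE_PATTERNS):
        return "bot"
    # Anything else: ask a follow-up question before deciding.
    return "clarify"
```

Checking the blacklist before the whitelist matters here: a message like “my order status says delivered but it arrived damaged” should reach a human, even though it also matches a topic the bot could normally handle.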
Building emotional intelligence into AI is not an easy task, and efforts are still at a relatively early stage of development. However, forward-thinking retailers should think about priority applications where this capability would be most important to enhance the customer experience and reduce friction. Emotional intelligence is less critical when providing product suggestions, for example, versus when helping a customer with a damaged item.
Simplifying the consumer experience with emotionally-aware bots is a win-win. Not only will customers be more satisfied, but retailers will also be able to gather more data and refine user profiles to better serve them in the future.