Bernard Marr: What Is Artificial Emotional Intelligence?


Humans have always been able to claim mastery over machines when it comes to understanding emotion. But that won’t be the case for long. While some may doubt that machines will encroach on emotion, those working in the field of artificial emotional intelligence, also known as emotion AI or affective computing, say we’re well on our way, with forecasts projecting the global affective computing market will reach $174 billion by 2025.

What is artificial emotional intelligence?

When computers can read emotions by analyzing data, including facial expressions, gestures, tone of voice, force of keystrokes, and more, to determine a person’s emotional state and then react to it, we call this artificial emotional intelligence. This ability allows humans and machines to interact in a much more natural way, very similar to how human-to-human interaction works.

Affective computing began in 1995 at the MIT Media Lab, where cameras, microphones, and physiological sensors gathered affective responses to identify emotion, and machines then responded to those emotions. This early work led Media Lab professor Rosalind Picard to publish “Affective Computing.” Today, a machine’s adeptness at evaluating data can help it pick up on subtle emotive nuances that some humans would miss.

How does artificial emotional intelligence work?

Through a combination of computer vision, sensors and cameras, tons of real-world data, speech science, and deep learning algorithms, artificial emotional intelligence systems gather data, then process it and compare it against data points labeled with key emotions such as fear and joy. Once the appropriate emotion is identified, the machine interprets the emotion and what it might mean in each case. As the emotion database grows, the algorithms get better at identifying the nuances of human communication.
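To make that pipeline a little more concrete, here is a minimal, hypothetical sketch in Python of the classification step. It assumes facial-expression features have already been extracted by a computer vision front end and trains a small neural-network classifier against a labeled emotion database; the features, labels, and data below are synthetic placeholders for illustration, not any vendor’s actual system.

```python
# Sketch of the emotion-classification step, assuming facial-expression
# features (e.g., landmark distances) were already extracted upstream.
# All data here is synthetic placeholder data, not real affective data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["joy", "fear", "anger", "surprise", "neutral"]

rng = np.random.default_rng(0)

# Stand-in for a labeled emotion database: 500 samples, 20 facial features each.
X = rng.normal(size=(500, 20))
y = rng.integers(0, len(EMOTIONS), size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A small deep-learning-style classifier that maps features to emotion labels.
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# "Identify the appropriate emotion" for a new observation.
probabilities = model.predict_proba(X_test[:1])[0]
predicted = EMOTIONS[int(np.argmax(probabilities))]
print(f"Predicted emotion: {predicted}")
```

In a real system, the quality of the feature extraction and the size and diversity of the emotion database matter far more than the choice of model in this toy setup.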

How is artificial emotional intelligence used today?

As the field continues to mature, many companies are actively using it to provide better service and products. Here are a few examples:

Affectiva, an emotion recognition software company, helps advertisers and video marketers capture viewers’ moment-to-moment facial expressions as they watch a video with its product Affdex for Market Research. This data is compared against the company’s emotion database and against benchmarks for sales lift, brand recall, and more to give its customers, such as Kellogg’s and CBS, ideas for optimizing their content and media spend. The company is also helping the automotive industry figure out how to use artificial emotional intelligence to transform the transportation experience, including road safety and passenger experience. This includes advanced driver state monitoring solutions that can identify an impaired or tired driver, as well as monitoring systems for autonomous vehicles.

Another company supporting marketers with artificial emotional intelligence solutions for market research is Realeyes. They use webcams, computer vision, and artificial intelligence to analyze viewers’ facial expressions as they watch videos. Realeyes can give critical feedback on the effectiveness of the creative used in ad campaigns, helping companies such as Coca-Cola and Hershey’s make their marketing more effective.

Microsoft has a team dedicated to developing new technologies that promote emotional resilience and well-being and will use artificial emotional intelligence to sense and respond to emotional states. The HUE (human understanding and empathy) team is tasked with bringing artificial emotional intelligence to Microsoft products, specifically in the areas of empathetic search, human understanding in gaming, and adaptive workspaces.

If you’ve ever been routed through a company’s customer service call center to resolve an issue and found your frustration level rising with each transfer, you will appreciate Cogito’s artificial emotional intelligence solution for improving the experience. It identifies the caller’s mood and adjusts how agents handle the call in real time.

Artificial emotional intelligence technology can also be useful for identifying mental health concerns. CompanionMx offers a mental health monitoring app that can identify signs of mood changes and anxiety when someone speaks into their phone. The MIT Media Lab created a wearable device called BioEssence that senses changes in heartbeat to identify pain, stress, and frustration and then releases a scent to help the person through that emotional state.
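The BioEssence example hints at a simple underlying pattern: monitor heart-rate variability and respond when it suggests stress. Below is a rough, hypothetical Python sketch of that idea using RMSSD, a standard heart-rate-variability statistic; the inter-beat intervals, the threshold, and the triggered “response” are made-up illustrative values, not the device’s actual algorithm.

```python
# Simplified illustration of heartbeat-based stress detection.
# RMSSD is a common heart-rate-variability measure; low variability
# relative to a personal baseline is often associated with stress.
import math

def rmssd(ibi_ms):
    """Root mean square of successive differences between inter-beat intervals (ms)."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stressed(ibi_ms, baseline_rmssd, threshold=0.7):
    """Flag stress when current variability falls well below the person's baseline."""
    return rmssd(ibi_ms) < threshold * baseline_rmssd

# Example: a calm baseline window versus a lower-variability (possibly stressed) window.
baseline = rmssd([820, 845, 810, 860, 830, 850])
current = [790, 795, 792, 794, 791, 793]
if stressed(current, baseline):
    print("Low heart-rate variability detected -- trigger a calming response.")
```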

Emotion AI can even serve as a form of assistive technology for people with autism, who often struggle to pick up on the emotional cues of others when communicating. A wearable device could help those with autism “read” the emotions of others and then respond appropriately to the other person.

As with so many other artificial intelligence applications, developers must be mindful that the data sets the technology is trained on represent the diversity of our global community in order to be useful to all. Hopefully this will happen as these applications see wider adoption.

Artificial emotional intelligence is definitely a fascinating and promising field. I will be watching very closely over the coming years.

How do you think artificial emotional intelligence can be beneficial?