What Is Artificial Emotional Intelligence & How Does Emotion AI Work?

Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions.

Well, artificial emotional intelligence, which is also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an “a,” not an “e”). In psychology, an “affect” is a term used to describe the experience of feeling or emotion.

If you’ve seen “Solo: A Star Wars Story”, then you’ve seen the poster child for artificial emotional intelligence: L3-37.

Lando Calrissian’s droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion. Lando (played by Donald Glover) is also injured during the getaway.

The “woke robot” demonstrates the ability to simulate empathy by interpreting the emotional state of a human, adapting its behavior to him, and giving an appropriate response to those emotions.

Now, this example might lead some video marketers and advertisers to think that emotion AI is science fiction. But it is very real.

A number of companies are already working to give computers the capacity to read our feelings and react in ways that have come to seem startlingly human. These include Affectiva, an emotion measurement technology company that spun out of MIT’s Media Lab in 2009, and Realeyes, an emotion tech company that spun out of Oxford University in 2007.

So, how do their technologies help brands, agencies, and media companies improve their advertising and marketing messages? Let’s tackle this question by examining how affective computing works.

How Does Artificial Emotional Intelligence Work?

Brands know emotions influence consumer behavior and decision making. So, they’re willing to spend money on market research to understand consumer emotional engagement with their brand content.

Affectiva uses a webcam to track a user’s smirks, smiles, frowns, and furrows, which it uses to measure the user’s levels of surprise, amusement, or confusion.

It also uses the webcam to measure a person’s heart rate, without the person wearing a sensor, by tracking the subtle color changes in the person’s face that occur each time the heart beats.
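
To make that idea more concrete, here is a minimal sketch of how a heart rate can be estimated from ordinary webcam footage using the color-change principle described above, a technique known in the research literature as remote photoplethysmography (rPPG). It relies on OpenCV’s stock face detector and a simple frequency analysis, and it is only an illustration of the general approach, not Affectiva’s actual implementation.

```python
import cv2
import numpy as np

# Remote photoplethysmography (rPPG) sketch: estimate heart rate from the
# subtle color changes in a person's face. Illustrative only -- this is not
# Affectiva's pipeline.

FPS = 30             # assume the webcam delivers roughly 30 frames per second
WINDOW_SECONDS = 15  # length of the signal window to analyze

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
capture = cv2.VideoCapture(0)

signal = []
while len(signal) < FPS * WINDOW_SECONDS:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        continue  # skip frames where no face is found
    x, y, w, h = faces[0]
    face = frame[y:y + h, x:x + w]
    # Blood volume changes show up most strongly in the green channel.
    signal.append(face[:, :, 1].mean())
capture.release()

# Remove the mean, then find the dominant frequency between 0.7 Hz and
# 4.0 Hz (42 to 240 beats per minute).
samples = np.array(signal) - np.mean(signal)
freqs = np.fft.rfftfreq(len(samples), d=1.0 / FPS)
power = np.abs(np.fft.rfft(samples))
band = (freqs >= 0.7) & (freqs <= 4.0)
heart_rate_bpm = 60.0 * freqs[band][np.argmax(power[band])]
print(f"Estimated heart rate: {heart_rate_bpm:.0f} BPM")
```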

Affectiva has turned this technology into a cloud-based solution that utilizes “facial coding” and emotion recognition software to provide insight into a consumer’s emotional responses to digital content. All a brand or media company needs is a panel of viewers with standard webcams and internet connectivity.

As viewers watch a video, Affectiva’s product, Affdex for Market Research, measures their moment-by-moment facial expressions of emotions. The results are then aggregated and displayed in a dashboard.
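
The aggregation step works, in spirit, something like the sketch below: hypothetical per-second emotion scores from individual panelists are averaged into a single moment-by-moment trace that a dashboard can plot against the video timeline. The column names and numbers are invented for illustration and are not Affdex’s real data schema.

```python
import pandas as pd

# Hypothetical facial-coding output: one row per panelist per second of the
# video, with emotion scores between 0 and 1. The labels and values are
# invented for illustration.
frames = pd.DataFrame({
    "panelist": ["p1", "p1", "p1", "p2", "p2", "p2"],
    "second":   [0, 1, 2, 0, 1, 2],
    "joy":      [0.10, 0.65, 0.80, 0.05, 0.40, 0.90],
    "surprise": [0.02, 0.30, 0.10, 0.01, 0.55, 0.20],
})

# Average across panelists to get a moment-by-moment emotion trace -- the
# kind of curve a dashboard would plot against the video timeline.
trace = frames.groupby("second")[["joy", "surprise"]].mean()
print(trace)
```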

Affdex for Market Research also provides video marketers and advertisers with norms that leverage Affectiva’s extensive emotion database and tie directly to outcomes such as brand recall, sales lift, purchase intent, and likelihood to share.

These norms benchmark a video or ad against ones from competitors – by geography, product category, media length, and repeat views. About one-third of the Fortune Global 100, including brands such as Kellogg’s and Mars as well as media companies like CBS, have used Affdex for Market Research to optimize their content and media spend.
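
Stripped to its essence, benchmarking against norms is a percentile calculation: where does this ad’s score fall within the distribution of comparable ads from the same geography and product category? The sketch below shows the idea with an invented norms set; the figures bear no relation to Affectiva’s actual database.

```python
import numpy as np

# Hypothetical norms: peak "joy" scores of past ads in the same product
# category and geography. Numbers are invented for illustration.
category_norms = np.array([0.41, 0.55, 0.38, 0.62, 0.47, 0.70, 0.52, 0.44])

# Peak joy score measured for the ad being tested.
new_ad_score = 0.58

# Benchmarking: what share of the norm set does this ad beat?
percentile = 100.0 * (category_norms < new_ad_score).mean()
print(f"This ad out-performs {percentile:.0f}% of its category norms on joy.")
```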

By comparison, Realeyes uses webcams as well as computer vision and machine learning to measure how people feel as they watch video content online.

First, a brand, agency, or media company selects a specific geography and audience segment that it wants to test.

Next, the Realeyes system provides a panel of 300 target viewers, who watch the videos on their own devices whenever they choose.

Then, the system’s algorithms process and analyze facial expressions in the cloud and show results on a dashboard within 24 hours.

The reports provided by Realeyes combine creative testing and media planning insights, enabling video marketers and advertisers to understand how consumers feel about their video content.

This enables brands (such as Coca-Cola, Hershey’s, and Mars), agencies (such as Ipsos, MarketCast, and Publicis), and media companies (such as Oath, Teads, and Turner) to optimize their content and target their videos at the right audiences.

The marketing potential of this new technology is huge. Realeyes estimates that 70 percent of marketing outcomes are driven by the effectiveness of the creative, but only 10 percent of budgets are invested in that key success driver across the industry, which means a lot of money is currently being wasted.

And Mordor Intelligence estimates that the global emotion detection and recognition market will grow at a compound annual growth rate (CAGR) of 32.7 percent over the next five years.
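
To put that figure in perspective, the standard compound-growth formula implies the market would roughly quadruple in size over those five years:

```python
# Compound annual growth: size_after_n_years = current_size * (1 + rate) ** n
cagr = 0.327
years = 5
growth_multiple = (1 + cagr) ** years
print(f"{growth_multiple:.1f}x the current market size in {years} years")  # about 4.1x
```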

If you’re interested in doing a deeper dive into Affectiva, then you should read Raffi Khatchadourian’s article in The New Yorker entitled, “We Know How You Feel.” Khatchadourian interviewed Rana el Kaliouby, Affectiva’s co-founder and CEO. She’s a passionate advocate for emotion AI and is on a mission to use artificial emotional intelligence to improve how people interact with technology and each other.

Mihkel Jäätma, the co-founder and CEO of Realeyes, was in New York City last week, and I got the chance to interview him. He is passionate about humanizing the big data that increasingly governs our daily lives. He sees it as his mission to bridge the gap between academic research into human emotions, the latest in machine learning, and actionable business applications. Here is a transcript of the interview.

Interview with Mihkel Jäätma, the co-founder and CEO of Realeyes

Greg Jarboe (GJ): Realeyes just announced a $16.2 million round of funding. Will this enable you to create breakthrough technologies instead of incremental improvements?

Mihkel Jäätma (MJ): I would say definitely breakthrough developments. What we’re doing is essentially teaching computers to understand our emotions. So, as far as breakthroughs go, this is as big as it gets.

Being able to understand and quantify the human mind is very much the final frontier – and our tech will help us get there sooner. This is what we’re investing in, and I think it will completely change the way people interact with technology every day.

Right now, if you look at our business, we’re very focused on the marketing performance space and disrupting how marketing is being done. We’re very happy with that focus.

There’s a lot to do to help marketers utilize video more effectively, and we’re having fun rewriting the rules of how it can be done more simply, with people more at the center of it. But what this investment will do is help us to look beyond the marketing space and look at how we can use this technology more broadly in other industries.

For example, healthcare is one of our priority areas going forward, particularly mental health. If our devices can understand when we are stressed or about to get depressed, then that can have a really positive impact on society as a whole.

It’s really interesting, and we are currently putting together some key partnerships to help roll that out. I think everybody has come to understand that AI is going to play a bigger and bigger role in our lives – and we want to ensure it has a positive impact.

GJ: So, why is this the right time to expand your particular business?

MJ: So, there are a number of factors. Firstly, there’s never been a better time because of the sheer number of connected devices that are available now. Last year was quite a milestone because it was the first year when we had more internet-connected webcams on this planet than human eyes. So, this infrastructure hasn’t been around until now at that scale, and it’s only going to continue to grow.

The other element is that the technology behind computer vision and machine learning, which helps those cameras decode our emotions, has also taken a major step forward. I mean, we have been working on this for a decade, so there have been a number of breakthroughs that we bring to market to actually make use of those cameras.

And, now that we’ve been running it for a fair few years, we’ve also built up a huge database, which is in turn fueling these algorithms to become even better. So, it’s really the first time in history when all of those three things have come together – and that’s creating great momentum for us.

GJ: So, are you starting with a big share of a small market?

MJ: Well, we are pioneers in a completely new market, so we currently have a big share of an industry which is projected to grow very fast indeed.

GJ: Do you have the right team in place?

MJ: Absolutely. Our team is everything. So, currently more than half of our team work in R&D because we want to make sure we adapt quickly to the demands of the market. But, it’s not just about building the tech, it’s also about using it right. So, the whole execution team that we have put in place is critical as well. We’ve brought in some new senior people, too, such as our new COO, Barry Coleman, a very experienced manager who runs the company across the four countries that we’re in and helps our people work together more efficiently.

We also have some new people joining in our go-to-market teams, plus Prof. Maja Pantic, who has been working with us for four years as a scientific advisor and will be helping us more as we expand into new sectors, such as healthcare.

GJ: So, do you have a way to not just create but deliver your products to market?

MJ: Yes. I have an incredible team that deliver our products to clients every day. We also work closely with agencies and media platforms to help distribute and scale our tech across the industry. So, yes, we have our own sales team, but this funding will help us build out those partnerships a lot more. We’re an enterprise software company and our industry partnerships are absolutely key in growing our business.

GJ: Will your market position be defensible 10 and 20 years in the future?

MJ: Absolutely. I believe we have the people, the data, and the tech to lead from the front for years to come.

GJ: Finally, have you identified a unique opportunity that others don’t see?

MJ: Well, we work very closely with our key customers to ensure we stay ahead of the curve and identify any new opportunities in the market. Others are starting to see the huge opportunities this technology can offer, but our relationships – along with our data, talent and team – give us a huge head-start.
