The Remarkable Effect Facial Recognition Can Have on Your CX

by Colin Shaw on March 4, 2019

We develop emotional intelligence as children, once we learn to read how other people feel from their behavior. Some of us develop it more than others.

Now, machines are developing emotional intelligence as well. The latest developments in this field are changing the way we can measure authentic customer emotions in real time.

We discussed how technology and facial recognition are changing how to measure authentic customer emotions in real time on our latest podcast. Our guest, Zhecho Dobrev, is a senior consultant at Beyond Philosophy, and he shared some of the latest developments in the field.

So, What Do We Mean by Authentic Emotion Measurement?

Authentic emotion measurement means capturing data on customers’ emotions as they occur in real time during your experience.

People researching customer emotions often do surveys where they ask customers how they felt during an experience. However, you’re asking them about an experience they had a week or two ago, or longer. That’s not measuring emotion; that’s measuring the recall of emotion, and memory is a tricky thing.

As global Customer Experience consultants, the challenge for us has always been capturing authentic customer emotions. We’ve been measuring emotions with surveys since 2005. From time to time, we’ve had people challenge whether customers can recall these emotions accurately.

It’s a fair question. While the truth is they do remember their emotions, the problem is they don’t remember all of them.

People tend to remember the peak emotion they felt, meaning the most intense moment of the experience, and how they felt at the end. It’s called the Peak-End Rule and it explains how we remember things.
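The Peak-End Rule can be sketched as a simple calculation. The following Python snippet is an illustrative model rather than a validated formula: it estimates the remembered intensity of an experience as the average of the most intense moment and the final moment, and contrasts that with the plain moment-by-moment average.

```python
def peak_end_estimate(intensities):
    """Estimate remembered emotional intensity per the Peak-End Rule:
    the average of the peak moment and the final moment."""
    return (max(intensities) + intensities[-1]) / 2

# Moment-by-moment intensity ratings for a hypothetical experience (0-10 scale).
moments = [2, 8, 5, 3]

remembered = peak_end_estimate(moments)     # (8 + 3) / 2 = 5.5
actual_mean = sum(moments) / len(moments)   # 4.5
```

The customer “remembers” a more intense experience (5.5) than they actually averaged moment to moment (4.5), which is one reason after-the-fact surveys drift away from what was really felt.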

Customer emotions drive customer behavior. Therefore, measuring those emotions in real time is essential.

So, how do you tell what the person is really feeling rather than what they say they’re feeling? You can put people in MRI scanners. With an MRI, you can see their brain patterns and everything else. However, they’re bloody huge machines, so they do not score high in practicality.

There is a more practical technology. It turns out that measuring authentic customer emotions starts with facial recognition.

And Where Does Facial Recognition Fit into it All?

Facial recognition can detect changes in your face and match you to an identity. It has been getting attention recently, which makes it seem new, but the technology has been in development since the 1970s. Researchers developed facial action coding systems that could identify the basic actions of your face and classify expressions.

Many of you probably relate facial recognition to how you can unlock your phone. However, the technology has many more applications than streamlining your access to Candy Crush Saga. In China, you can pay with your face at a KFC. In the U.S., you can now check in at Delta with your face at some airports like Atlanta and Detroit.

People are also using this technology in security. In India, the Delhi Police were able to track 3,000 missing children in just four days and reunite them with their families. In China, authorities were able to track a mentally ill man and return him to his brother using his picture.

However, some of the technology companies have started to look beyond the facial recognition aspect to facial emotion recognition. You can use the technology not only to recognize the face, but also to recognize how people feel without asking them—and in real time.

Micro Expressions Have Macro Changes in Store for Customer Experience

There are two ways of interpreting or signaling emotions. There’s the overt emotion that somebody puts on their face. They want you to know they are happy, so they smile, and their eyes crinkle up at the edges.

However, there is another way to signal emotions. They are called micro expressions, and for Customer Experience, they open up a whole lot of huge opportunities.

Micro expressions display the emotions we feel at an unconscious level. Sometimes, they reveal emotions we don’t necessarily want to share.

From a widening of the eyes when we are surprised to a tightening of the mouth when we are angry, we all have emotional “tells.” These micro expressions are the reason gamblers on ESPN wear balaclavas and sunglasses to hide these tells from the other players.

An emotion is a combination of a number of micro expressions. The facial recognition software would tell you the main emotion and then supply the percentages of the other emotions it detected. For example, you might get data showing a customer is around 70 percent surprised, with the other 30 percent being anger.
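To make that breakdown concrete, here is a minimal Python sketch of how such per-emotion percentages might be processed. The `scores` dictionary is a made-up example payload, not the output of any specific vendor’s software.

```python
# Hypothetical per-frame output from facial emotion recognition software:
# each detected emotion with the share of the expression it accounts for.
scores = {"surprise": 0.70, "anger": 0.30}

# The main emotion is the one with the highest share.
main_emotion = max(scores, key=scores.get)

# Secondary emotions are everything else, ordered by strength.
secondary = sorted(
    (e for e in scores if e != main_emotion),
    key=scores.get,
    reverse=True,
)

print(main_emotion)  # surprise
print(secondary)     # ['anger']
```

In practice the software reports this frame by frame, which is what makes a real-time read of the experience possible.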

Using facial recognition technology, you can identify how the customer is feeling walking into the experience, how they are feeling during the interaction with the employee, and how they felt when they left. From a practical perspective, you can see the effect of your Customer Experience design, the effects of the customer service training you invested in, or whatever else you want to measure that relates to how the customer feels.

In a digital experience, you can record a customer going through the existing experience, like booking an appointment online or buying a product, to see how they feel throughout. When combined with eye tracking, it can tell you not only that customers feel an emotion at a certain point but also what caused that emotion.

Bonus: An Old Technology with a Whole New Use

Some people feel emotions but do not show them. Galvanic Skin Response (GSR), a technology that has also been around for many years, can measure the intensity of people’s emotions. A device reads the changes in electrical resistance caused by perspiration on people’s hands.

What’s interesting about GSR is that it’s about the skin. When you are in a state of arousal, such as when you’re feeling something intensely, your body reacts by producing sweat. The perspiration changes your skin’s ability to conduct electricity. The device runs a small current across your skin to measure the change in electrical resistance, and that change correlates with the intensity of the emotion.
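As a rough illustration of the arithmetic involved: the resistance-to-conductance conversion below is standard physics, but the spike threshold and function names are invented for this sketch, not taken from any GSR vendor.

```python
def conductance_microsiemens(resistance_ohms):
    """Convert measured skin resistance (ohms) to conductance (microsiemens).
    Conductance is simply the reciprocal of resistance."""
    return 1_000_000 / resistance_ohms

def arousal_spikes(readings_us, baseline_us, threshold_us=0.5):
    """Flag the indices where conductance rises above baseline by more than
    the threshold -- a crude proxy for moments of heightened emotion."""
    return [i for i, c in enumerate(readings_us) if c - baseline_us > threshold_us]

# 500 kilohms of skin resistance corresponds to 2 microsiemens of conductance.
readings = [2.0, 2.1, 3.2, 2.2]             # conductance over time, microsiemens
spikes = arousal_spikes(readings, baseline_us=2.0)
print(spikes)  # [2] -- the third reading stands out
```

More sweat means lower resistance, which means higher conductance, so the spikes mark the moments the emotion was felt most intensely.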

Obviously, in many settings, GSR is as impractical as an MRI. That said, matching GSR intensity readings with facial recognition data about which emotion is present could be really interesting.

Learning More about This Technology Can Make a Big Difference in Your CX

Customers have emotions because they’re people, and therefore you want to measure them. The challenge has been that self-reported emotion carries inherent dangers. With facial recognition technology, you can have facts: a read of authentic customer emotions as they occur in your Customer Experience.

We’re in the infancy of this technology, but it’s going to become mainstream. Customer Experience professionals would be wise to think about how this technology is going to take hold. For me, it’s about getting ahead of that curve.

Zhecho Dobrev and I will host a webinar, Facial Recognition for Measuring Customer Emotions in Real Time, on March 13, 2019, at 11 am Eastern. Register here to join us.



Hear the rest of the conversation on How to Measure Authentic Customer Emotions in Real Time on The Intuitive Customer Podcast. These informative podcasts are designed to expand on the psychological ideas behind understanding customer behavior. To listen in, please click here.



If you enjoyed this post, you might be interested in the following blogs and podcasts:

How Do Customers Decide If Their Experience is Good or Bad? [Podcast]

How We Make Decisions—Prospect Theory

Why Customers Make Strange Decisions


Colin Shaw is the founder and CEO of Beyond Philosophy, one of the world’s leading Customer Experience consultancy and training organizations. Colin is an international author of six bestselling books and an engaging keynote speaker.


Follow Colin Shaw on Twitter @ColinShaw_CX
