The empathy economy

Victoria Cleverby
3 min read · Jun 9, 2021

The empathy economy is a name for the economy based on the value, monetary or otherwise, that businesses create with AI and other technological innovations that detect and simulate human emotions. Put briefly, this means there is a whole industry focused on making AI recognise human emotion and respond to it accordingly. And this industry is developing rapidly: a Gartner prediction published in January 2019 claimed that within four years, the devices you use would know more about how you feel than your own parents do.

One of the companies at the forefront of this technology is Affectiva. They use emotion AI: software that can understand nuanced human emotions and complex cognitive states based on facial and vocal expressions. One of their areas of application is ad testing, where they measure customers’ unfiltered and unbiased emotional and cognitive responses by analysing their facial movements. This enables organisations to understand how their customers and viewers feel when they can’t, or won’t, say so themselves, and the insights can then be applied to improve brand experiences and communications. Another area they operate in is the automotive industry, where the technology is used to detect complex and nuanced states of driver impairment, such as levels of drowsiness, distraction and anger, to ensure road safety. It also monitors the passengers’ state, and whether it could be improved by adapting the music, lighting or temperature.
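To make the idea more concrete, here is a minimal sketch, in Python, of what such an emotion-sensing loop might look like. To be clear, this is not Affectiva’s actual SDK: the face detection uses OpenCV’s standard Haar-cascade API, while EmotionModel and its label set are hypothetical stand-ins for a trained facial-expression classifier.

```python
# Hypothetical emotion-AI loop: detect a face in each video frame,
# classify its expression, and react (e.g. suggest a rest stop).
# The OpenCV calls are real; EmotionModel is an assumed, illustrative classifier.
import cv2

class EmotionModel:
    """Stand-in for a trained facial-expression classifier (assumption)."""
    def predict(self, face_pixels):
        # A real system would run a neural network here; we return a placeholder.
        return "drowsiness", 0.42  # (label, confidence)

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = EmotionModel()
capture = cv2.VideoCapture(0)  # default webcam

for _ in range(300):  # roughly ten seconds of video at 30 fps
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        label, confidence = model.predict(gray[y:y + h, x:x + w])
        if label == "drowsiness" and confidence > 0.8:
            print("Driver looks drowsy: suggest a rest stop")

capture.release()
```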

We have also seen some patents in this space lately. In 2018, Amazon filed a patent for a system that detects the physical and emotional well-being of users based on their previous and current interactions. Its Rekognition suite (used by outside businesses and organisations) can now see fear in our faces, while Alexa can sense and understand frustration. And Spotify was recently granted a patent on technology that aims to use recordings of users’ speech and background noise to determine what kind of music to curate and recommend to them.

One of the big drawbacks of the technology is that it is prone to bias, because of the subjective nature of emotions. It is also often not sophisticated enough to understand cultural differences in how emotions are expressed and read, making it harder to draw accurate conclusions. In fact, in 2019 the Association for Psychological Science conducted a review of the evidence, concluding that there is no scientific support for the common assumption that a person’s emotional state can be readily inferred from their facial movements.

One example of where the technology could deliver not-so-great results is an elderly person using Affectiva’s automotive solution. Elderly people might be more likely to be wrongly identified as fatigued, since the older the face, the less accurately its expressions are decoded. And as these systems become more commonplace, insurance companies are going to want a piece of the data. This could mean higher premiums for older drivers, as the data would suggest that, despite many prompts to rest, the driver pressed on. And how empathic is that?
