
UK Watchdog Issues Warning Against Emotional Analysis Tech

A photo of a computer screen running “Real Time Face Detector” software shows visitors’ expressions analyzed and explained in real time at the stand of the Fraunhofer Institute at the CeBIT trade fair in Hanover on March 6, 2008. The Real Time Face Detector is a software module that can be used for fast face detection in video streams and single pictures.
Photo: John MacDougall (Getty Images)

The head of the United Kingdom’s independent privacy watchdog worries that highly hyped efforts to use AI to detect people’s emotional states simply may not work now, or possibly ever.

In a first-of-its-kind notice, the Information Commissioner’s Office, Britain’s top privacy watchdog, issued a searing warning to companies against using so-called “emotional analysis” tech, arguing it’s still “immature” and that the risks associated with it far outweigh any potential benefits.

“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever,” ICO Deputy Commissioner Stephen Bonner wrote. “While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination.”

Emotion analysis, also known as emotion recognition or affect recognition, follows similar principles to better-known biometric techniques like facial recognition, but is arguably even less reliable. Emotional analysis or emotion recognition systems scan individuals’ facial expressions, voice tones, or other physical features, and then attempt to use those data points to infer mental states or predict how someone feels.
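
To make that idea concrete, here is a minimal, purely illustrative sketch of the kind of reduction such systems perform: a few physical measurements are scored against hypothetical weights and forced into one of a handful of discrete emotion labels. The feature names, weights, and categories below are invented for illustration and are not drawn from any vendor’s product; the simplicity is the point, since the ICO’s and Crawford’s concern is precisely that this reduction lacks a sound scientific basis.

```python
# Illustrative toy only: hypothetical features and weights, not any real product's pipeline.
# Shows how "emotion analysis" collapses physical signals into one discrete label.

FEATURES = {
    "mouth_corner_lift": 0.8,     # hypothetical normalized measurement, 0.0 to 1.0
    "brow_raise": 0.2,
    "voice_pitch_variance": 0.4,
}

# Hypothetical linear weights mapping features to a small set of discrete categories.
CATEGORY_WEIGHTS = {
    "happy":     {"mouth_corner_lift": 1.0, "brow_raise": 0.2, "voice_pitch_variance": 0.3},
    "surprised": {"mouth_corner_lift": 0.1, "brow_raise": 1.0, "voice_pitch_variance": 0.6},
    "neutral":   {"mouth_corner_lift": 0.3, "brow_raise": 0.3, "voice_pitch_variance": 0.2},
}

def infer_emotion(features: dict) -> str:
    """Score each category and return the single highest-scoring label.

    This forced choice among a few discrete categories is the step critics
    argue has no solid scientific backing.
    """
    scores = {
        label: sum(weights[name] * features.get(name, 0.0) for name in weights)
        for label, weights in CATEGORY_WEIGHTS.items()
    }
    return max(scores, key=scores.get)

print(infer_emotion(FEATURES))  # prints "happy", even if the person merely smiled politely
```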

USC Annenberg Research Professor Kate Crawford details some of the inherent pitfalls of that approach in her 2021 book Atlas of AI.

“The difficulty in automating the connection between facial movements and basic emotional categories leads to the larger question of whether emotions can be adequately grouped into a small number of discrete categories at all,” Crawford writes. “There is the stubborn issue that facial expressions may indicate little about our honest interior states, as anyone who has smiled without feeling truly happy can confirm.”

Bonner went on to say that “the only sustainable biometric deployments” are ones that are fully functional, accountable, and “backed by science.” Although the ICO has issued warnings about specific technologies in the past, including some falling under the category of biometrics, Bonner told The Guardian this week’s notice marks the first general warning about the ineffectiveness of an entire technology. In that article, Bonner described attempts to use biometrics to detect emotion as “pseudoscientific.”

“Unfortunately, these technologies don’t seem to be backed by science,” Bonner told The Guardian.

And while the ICO post spends some time calling out potential threats inherent to biometric tech more broadly, such as facial recognition used for ID verification or airport check-ins, the watchdog maintains emotional analysis is uniquely worrisome.

“The inability of algorithms which are not sufficiently developed to detect emotional cues, means there’s a risk of systemic bias, inaccuracy and even discrimination,” the ICO post reads.
