Contradicting claims that software can detect emotions from the face, a new report has found that facial movements are an inexact gauge of a person's feelings, behaviour or intentions.
Such software is being used for a variety of purposes, including surveillance, hiring, clinical diagnosis, and market research.
"It is not possible to confidently infer happiness from a smile, anger from a scowl or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts," a group of scientists wrote in their comprehensive research review.
The report was published in the journal Psychological Science in the Public Interest.
The authors noted that the general public and some scientists believe that there are unique facial expressions that reliably indicate six emotion categories: anger, sadness, happiness, disgust, fear, and surprise.
But in reviewing more than 1,000 published findings of facial movements and emotions, they found that typical study designs don't capture the real-life differences in the way people convey and interpret emotions on faces.
A scowl or a smile can express more than one emotion depending on the situation, the individual or the culture, they said.
"People scowl when angry, on average, approximately 25 per cent of the time, but they move their faces in other meaningful ways when angry," explained Lisa Feldman Barrett, the author of the report.
"They might cry, or smile, or widen their eyes and gasp. And they also scowl when not angry, such as when they are concentrating or when they have a stomach ache. Similarly, most smiles don't imply that a person is happy, and most of the time people who are happy do something other than smile," added Barrett.
In a separate article in the journal, researchers noted that most scientists agree that facial expressions are meaningful, even if they don't follow a one-to-one match with six basic emotion categories.
They proposed a new model for studying emotion-related responses in all their complexity and variations. This approach would measure not only facial cues but also body movements, voice fluctuations, head movements and other indicators to capture such nuanced responses as smiles of embarrassment or sympathetic vocalisations, they said.
"We thought this was an especially important issue to address because of the way so-called 'facial expressions' are being used in industry, educational and medical settings, and in national security," said Barrett and her co-authors.