With face-reading software, a computer's webcam might spot the confused expression of an online student and prompt the program to offer extra tutoring. Or computer-based games with built-in cameras could register how players react to each move and quicken the pace if they seem bored. But the rapidly developing technology is far from infallible, and it raises many questions about privacy and surveillance.
Companies in this field include Affectiva, based in Waltham, Massachusetts, and Emotient, based in San Diego. Affectiva used webcams over two and a half years to accumulate and classify about 1.5 billion emotional reactions from people who gave permission to be recorded as they watched streaming video, says Rana el-Kaliouby, the company's co-founder.
The company strongly believes that people should give their consent to be filmed, and it will approve and control all of the apps that emerge from its algorithms, el-Kaliouby says. Face-reading technology may one day be paired with programs that have complementary ways of recognising emotion, such as software that analyses people's voices, says Paul Saffo, a technology forecaster.
For some, this type of technology raises an Orwellian specter. And Affectiva is aware that its face-reading software could stir privacy concerns. But el-Kaliouby says that none of the coming apps that use Affectiva's software will record video of people's faces. "The software uses its algorithms to read your expressions," she says, "but it doesn't store the frames."
So far, the company's algorithms have been used mainly to monitor people's expressions as a way to test ads, movie trailers and television shows in advance. Affectiva's clients include Unilever, Mars and Coca-Cola. The advertising research agency Millward Brown says it has used Affectiva's technology to test about 3,000 ads for clients.
Apps that can respond to facial cues may find wide use in education, gaming, medicine and advertising, says Winslow Burleson, an assistant professor of human-computer interaction at Arizona State University. People with autism, who can have a hard time reading facial expressions, may be among the beneficiaries, Burleson says.
But facial-coding technology raises privacy concerns as more and more of society's interactions are videotaped, says Ginger McCall, a lawyer and privacy advocate in Washington. "The unguarded expressions that flit across our faces aren't always the ones we want other people to readily identify," McCall says - for example, during a job interview.
She says the programs could be acceptable for some uses, such as dating services, as long as people agreed in advance to have webcams watch and analyse the emotions reflected in their faces. "But without consent," McCall says, "they are problematic - and this is a technology that could easily be implemented without a person's knowledge."