I compute your pain

Emotion-sensing face-recognition software could transform marketing.
Karl D. Stephan | Jul 27 2015



My wife usually knows when I'm upset long before I do.  I haven't performed a scientific study to determine how she does this.  She says she reads my body language, the tone of my voice, and my facial expressions, as well as what I say. 

Women seem to have a built-in advantage when it comes to sensing the emotional states of others, so it's not a surprise that the co-founders of a company that sells software to read emotions were two women:  Rosalind Picard and Rana el Kaliouby. 

The history of why they began their research into getting computers to sense emotions and what their company is doing now may tell us something about the ethical challenges to come if companies begin using emotion-reading software on a large scale.  A recent article in The New Yorker profiles these women and their work.

Back in the 1990s, almost no one in computer science was thinking professionally about emotions.  One of the few exceptions was Rosalind Picard, who has been on the MIT faculty since 1991.  She realized that computers could serve people better if they had a clue as to what emotional state a person was in.  Despite confused stares and even active discouragement from computer-science colleagues, she persisted in researching what she termed "affective computing" and wound up establishing an entirely new field.

Rana el Kaliouby entered the fray by a similar route.  Her first idea of a practical application of affective computing was to develop a kind of emotional hearing aid for autistic people, whose disability usually prevents them from inferring the emotional state of people around them.  She wound up teaming with Picard on some academic research projects, which attracted so much attention that they decided to spin off a company called Affectiva in 2009.

Once their ideas left academia for the commercial world, the tone of things changed.  Every second, thousands of marketers are competing for online attention—in TV ads, YouTube video ads, smartphone apps, and all the other electronic attention-grabbers we surround ourselves with these days.  Someone has even calculated what the average American's attention is worth:  about six cents per minute, as of 2010.

Your attention is the coin of the realm that you exchange for "free" internet services, and companies who sell these services would dearly like to know how you feel about what you see.  This is what Affdex, the software offered by Affectiva, is supposed to do.

It works by monitoring facial expressions in a sophisticated way, using fixed points (e.g., the tip of the nose) as references for the movement of the eyebrows, the corners of the mouth, and other features that have been shown to be emotionally expressive.  The result is a readout of four emotional dimensions:  happy, confused, surprised, and disgusted.  I expect most marketers try to get the biggest happy readings, maybe laced with a little surprise here and there, while trying to lower the confused and disgusted numbers.  Anyway, lots of companies are willing to pay lots of money to get these numbers.
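In rough outline, a landmark-based readout like the one described above can be sketched in a few lines of Python. This is a hypothetical toy, not Affectiva's actual model: the landmark names, neutral-face baselines, and hand-tuned thresholds are all invented for illustration.

```python
# Toy sketch of landmark-based expression scoring (hypothetical; NOT Affdex's
# real algorithm). Landmarks are (x, y) pixel coordinates; y grows downward.

BASELINE = {  # neutral-face landmark positions, relative to the nose tip
    "left_mouth_corner":  (-30.0, 40.0),
    "right_mouth_corner": ( 30.0, 40.0),
    "left_brow_inner":    (-15.0, -35.0),
    "right_brow_inner":   ( 15.0, -35.0),
}

def relative_positions(landmarks):
    """Express each landmark relative to the nose tip, the fixed reference."""
    nx, ny = landmarks["nose_tip"]
    return {name: (x - nx, y - ny)
            for name, (x, y) in landmarks.items() if name != "nose_tip"}

def expression_scores(landmarks):
    """Map landmark displacements to four toy emotion scores in [0, 1]."""
    rel = relative_positions(landmarks)
    # Vertical displacement of each feature from its neutral baseline.
    dy = {name: rel[name][1] - BASELINE[name][1] for name in BASELINE}
    clamp = lambda v: max(0.0, min(1.0, v))
    # Invented rules: raised mouth corners suggest a smile; raised brows
    # suggest surprise; lowered brows suggest confusion, and so on.
    mouth_raise = -(dy["left_mouth_corner"] + dy["right_mouth_corner"]) / 2
    brow_raise  = -(dy["left_brow_inner"] + dy["right_brow_inner"]) / 2
    return {
        "happy":     clamp(mouth_raise / 10.0),
        "surprised": clamp(brow_raise / 10.0),
        "confused":  clamp(-brow_raise / 10.0),
        "disgusted": clamp(-mouth_raise / 10.0),
    }

frame = {
    "nose_tip": (100.0, 100.0),
    "left_mouth_corner":  (70.0, 132.0),   # corners raised 8 px: a smile
    "right_mouth_corner": (130.0, 132.0),
    "left_brow_inner":    (85.0, 65.0),    # brows at their neutral height
    "right_brow_inner":   (115.0, 65.0),
}
print(expression_scores(frame))
```

Using the nose tip as the reference point makes the readout insensitive to the face simply moving around in the frame, which is why systems of this kind anchor expressive features to fixed landmarks.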

So far, the main use has been in focus groups and other controlled settings where consumers have consented to the use of video of their faces.  You can imagine the day when those little built-in cameras in computers and smartphones will be activated for emotion-reading software, possibly without your knowledge.  That is a line which, to my own knowledge, has not been crossed yet.

But I can easily picture a situation in which a browser turns on your camera to watch your face, maybe in exchange for some bonus or free this or that, and somehow it just never gets turned off again.

Like most technology, affective computing can be used for either good or not-so-good purposes.  If software developers could learn how to sense a user's emotions, it could make software a lot easier to use.  I can think of many times when I was trying to do something with new software and got frustrated or confused.  Software that could sense this and trot out simpler and simpler explanations and help files, and even ask questions ("Just what are you trying to do?") would be a tremendous advance over that peculiar sense of helplessness I get when I face a zillion menu options and know one of them will do what I want, only I don't have twelve hours to spare in order to try each one.

On the other hand, both Picard and el Kaliouby realize that this sort of software could be abused.  Picard left Affectiva in 2013, and although el Kaliouby is still with the firm, she expresses some disappointment that the commercial applications of Affdex have overshadowed the assistive applications for autistic people and others with disabilities.

If one tries to come up with a worst-case scenario for how emotion-reading software could be abused, some sort of subliminal manipulation comes to mind.  What if emotion-reading ads prove to be well-nigh irresistible?

Years ago, there was a flap of concern that advertisers were inserting single-frame images into TV ads that went by so quickly your conscious mind didn't even notice them.  But supposedly, they went straight to your subconscious and made you go out and buy a Coke you didn't need, or something like that.  So-called subliminal advertising has proven to be useless, but we have yet to see how effective advertising will be when it's coupled to software that can read the viewer's emotional state and change its presentation in real time in response.  Of course, a good salesman does this instinctively, but up to now Internet advertising has been open-loop, with no way of knowing what the viewer felt about the ad.  Software such as Affdex promises to close that loop.
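The difference between open-loop and closed-loop advertising can be made concrete with a short sketch. Everything here is hypothetical: the emotion scores stand in for the output of software like Affdex, and the switching rules are invented for illustration.

```python
# Toy sketch of "closing the loop": an ad server that swaps its creative in
# real time based on an emotion readout. Hypothetical rules, not a real API.

def pick_next_variant(current_variant, emotions, variants):
    """Switch ad variants when the viewer looks confused or disgusted."""
    if emotions["confused"] > 0.5:
        return variants["simpler"]   # back off to a plainer message
    if emotions["disgusted"] > 0.5:
        return variants["softer"]    # tone the pitch down
    return current_variant           # viewer seems happy enough: keep going

variants = {"simpler": "plain-text ad", "softer": "muted ad"}

# Open-loop advertising just plays the ad regardless of reaction.
# Closed-loop advertising feeds the reaction back into what plays next:
readout = {"happy": 0.2, "confused": 0.7, "disgusted": 0.1}  # from the camera
ad = pick_next_variant("flashy ad", readout, variants)
print(ad)  # a confused viewer gets switched to the plainer variant
```

The feedback step is the whole point: the viewer's reaction, rather than the advertiser's guess, drives what is shown next, which is what makes the manipulation scenario above worth worrying about.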

Let's hope that affective software leads to a kinder, gentler interaction with the machines that take up an increasing part of our lives, without taking us down a road of manipulating consumers without their knowledge or consent.

Karl D. Stephan is a professor of electrical engineering at Texas State University in San Marcos, Texas. This article has been republished, with permission, from his blog, Engineering Ethics, which is a MercatorNet partner site.



This article is published by Karl D. Stephan and MercatorNet.com under a Creative Commons licence. You may republish it or translate it free of charge with attribution for non-commercial purposes following these guidelines. If you teach at a university we ask that your department make a donation. Commercial media must contact us for permission and fees. Some articles on this site are published under different terms.
