Last fall, as people around the world stared into their computer screens, the British Broadcasting Corporation was staring right back.
The company was checking out the expressions on people’s faces as they read advertisements online. Using webcams, the company scanned the faces of 5,153 study participants for happiness, sadness, fear, surprise, puzzlement, and rejection as they viewed online ads.
Why bother to get the cameras involved? Marketers want a more accurate picture of how advertising really makes people feel. Instead of just asking focus groups how they feel, or what they think, they’re going straight to the subconscious source and watching faces react in real time.
Of course, human emotions are more complicated than a few smiles and frowns. Andrew Tenzer, senior research manager at BBC Global News, says he knows that. A face of “puzzlement” could mean something is intriguing, or it might just be an indication that a person is deep in thought. But Tenzer also says that, unlike traditional survey and questionnaire results, “our face never lies.”
Using the six basic emotions to track expressions on people’s faces, the BBC marketing team found a few key ways to make text-based online ads more successful. The new research, published online in January, found, not surprisingly, that people don’t like to be tricked into reading ads that look like news articles. But participants’ faces registered much less “rejection” when those same ads were clearly labeled.
In the past, the BBC has used its facial recognition technology, built by in-house startup CrowdEmotion, to track 4,500 faces as people watched show trailers to see what kinds of emotions the trailers produced. It has also looked at how hundreds of study participants react to programs like Top Gear and Sherlock.
“Using CrowdEmotion’s emotional intelligence platform, broadcasters can find out exactly what an audience thought about a programme instead of being fed skewed data by respondents trying to give the right answer,” CrowdEmotion wrote in a release.
Facial recognition software like CrowdEmotion’s has gained traction in recent years and is often used for identifying people in photos. Facebook uses “DeepFace” to pinpoint who’s who in images so it can predict whom you might want to tag. The State Department uses the tech to catalogue the faces of people registering for visas and passports into searchable databases. Scientists will soon use their own version of facial recognition to help save whales by more quickly identifying them when they’re tangled up in fishing gear.
Now a few U.S. companies are starting to experiment with using facial recognition to track the reactions of American TV viewers to commercials and TV shows. TVision, a company that measures “eyes-on-screen,” is in about 600 households in the Boston area. Last weekend, the startup awarded the “most smiled at” ad of Super Bowl 50 to the Amazon Echo commercial. It’s the one in which Alec Baldwin parties with Dan Marino and Missy Elliott while ordering his Echo voice assistant, Alexa, to do things around the house like turn on the lights and answer questions about Marino’s failed Super Bowl record.
TV ratings giant Nielsen is also buying up facial recognition technology. Last year, it acquired Innerscope Research, a “neuromarketing” company that tracks emotions using skin sweat, eye movements, and other biometrics to find out exactly where audiences are looking on screen and how they feel. But so far Nielsen hasn’t used the new approaches in any of its industry-standard TV ratings.
There is a certain creep factor that goes along with monitoring what happens right in your living room. Samsung learned that the hard way last year when it warned users of its Smart TVs to watch what they say in conversation around their sets. That could be part of the reason there are no prizes yet for the “most feared” or “happiest” show on television. At least not yet.