Computers are good at tackling relatively straightforward tasks like crunching numbers and retrieving search results. But when it comes to deciphering the complexities of human emotions, well, they’re still a work in progress.
Rana el Kaliouby, cofounder and CEO of the startup Affectiva, is trying to solve that problem. Her company is working to teach technology to accurately read facial expressions, gestures, and tone of voice with the goal of making computers “emotionally intelligent.”
El Kaliouby doesn’t want to make computers emotional. Instead, her hope is to make computers understand emotions in real time—whether someone is laughing or angry, for example—so that the information can be used to make decisions.
Ad companies could better understand whether consumers engage emotionally with marketing pitches. Hiring managers could more easily screen applicants for customer-facing positions by analyzing their behavior (better candidates tend to show a wider range of emotions, the theory goes). Meanwhile, mental health professionals could use the technology to track patients' facial and vocal cues to better gauge whether they're depressed, and if so, how severely.