Empathy is one of the most important leadership traits for managers. It helps build trust and connection among teams and demonstrates a leader’s ability to understand the needs of employees. But as managers find their roles assailed by tech leaders and face the ubiquitous fear of A.I.-related job loss, they can take comfort in the fact that artificial intelligence can’t effectively replicate empathy, experts say.
When asked if ChatGPT can empathize, Maryville University communications professor Dustin York is unequivocal in his response: “100 percent, no. It’s not a sentient thing like sci-fi movies would have you think.”
While ChatGPT can easily perform rote tasks like writing job descriptions or summarizing meeting notes, it’s unlikely to serve as a one-to-one substitute for managers, a group many companies have been targeting for layoffs in recent months. That’s because the role of a manager is uniquely human, requiring core soft skills like collaboration, communication, and team-building that are difficult, if not impossible, to teach generative A.I.
“ChatGPT can take a large data set and come out with a confident-sounding answer to a lot of different things,” York says. “It’ll spit out whatever you put into it. But does it have feelings? Can it feel empathy? Not anytime soon.”
A manager who asks ChatGPT to elicit an empathetic reply will likely receive one. But that’s only because the technology responds to a prompt. It doesn’t have inherent traits like compassion or emotional intelligence, which managers use to foster loyalty and drive their teams toward a collective goal.
“It takes a human leader to fill in those blanks,” York says.
ChatGPT’s limitations in displaying empathy stem, in part, from the fact that it’s a text-based platform. Lacking a physical form, ChatGPT can’t span the full breadth of human communication: speech, facial expressions, and hand gestures all play a part in conveying and registering feelings, says Julia Hirschberg, a Columbia computer science professor who studies empathetic speech in artificial intelligence. ChatGPT is, despite all its technological novelty, just a very sophisticated search engine. And for all its promise, at the moment it’s still just text on a screen.
The technology also can’t pick up on the social cues that determine what emotional response a given situation demands. When a colleague is distraught over a death in the family, it can be obvious from the pain in their voice that they need emotional support, a comforting word, or a tender gesture. And even if ChatGPT could pick up on those context clues, it can’t deliver a sufficiently empathetic response via text alone. According to Hirschberg, people tend to speak slowly, in a softer voice, and with more frequent pauses when expressing empathy.
Since, as the saying goes, the medium is the message, ChatGPT is limited on both counts: it can only produce text responses.
“Even when you give ChatGPT the best data set in the world, even after you edit what it gives you, your ceiling will still be at an email level,” York says. “And every person that works in the professional world knows you can’t get the same level of empathy through an email as in person.”
Soft skills, he predicts, will continue to reign supreme, and they can’t be duplicated by A.I.—for now.