Why body language is the next frontier for robots

It’s not every day that a robot takes the stage at a technology conference and cracks jokes like a real wise guy.

On Wednesday, at an annual conference focused on robotics for businesses, Robert High, chief technology officer of IBM’s Watson supercomputing division, shared the stage with a little white robot. He used the occasion to explain the evolution of robotic technology and how businesses outside of manufacturing can use robots.

Although robotic technology has been around for decades, advances in artificial intelligence are paving the way for robots to do more than merely attach doors to cars on assembly lines. Computers can now be fed huge amounts of data, learn from that information, make decisions, and explain the rationale behind those decisions.

IBM’s Watson is one example, High said, citing its ability to learn and make choices, which helped it win Jeopardy! in 2011.

IBM (IBM) has since been experimenting with embedding so-called cognitive computing systems into robots, said High. As an example, High showed off a robot that looked like a small astronaut and came packed with Watson technology.

A robot on stage

High talked about the importance of non-verbal communication, like body language, that helps punctuate the meaning of words. While High was speaking, the robot suddenly interrupted him and said, in a pleasant female voice, “You mean this?” while flailing its arms back and forth. The robot understood what High was talking about and responded with an appropriate gesture, drawing chuckles from the audience.

High then asked the robot to recommend a meal containing avocados. The robot acknowledged the request and asked High what style of food he would like, to which High replied that he would like something Southwestern.

“And no onions, right?” asked the robot, as it remembered that High doesn’t like onions in his meals.

The robot then presented High with a recipe for a Southwestern avocado panini, and when High asked where he could buy the ingredients, the robot gave him directions to a nearby Whole Foods. Each time the robot said to either take a left or right turn, it lifted its appropriate arm to match the word.

It’s this sort of conversation between man and machine that High envisions will be the norm in the future. Robots that can accurately understand what a person is asking, incorporate body gestures to emphasize their own speech, and remember a person’s preferences will help create a better emotional bond between human and machine.

A robot receptionist might one day greet you at a hotel and talk to you in a friendly manner while checking you in and giving you directions to your room, for example.

“To be effective, robots need to create an emotional connection,” said High. If humans are to interact with robots through speech and gesture, it has to be a pleasant experience. If it’s not, people are less likely to want to interact with them.

Of course, this was just a conference demonstration, and we are probably still years away from seeing the type of robot that High showed off on stage. But he believes that advances in computing will make it possible in the next five years.
