Bridging the uncanny valley between humans, robots

April 18, 2014 By Kelly April Tyrrell

A Nao robot serves “sushi” to attendees at the Human-Computer Interaction Laboratory open house. The robot chose from among several sushi options based on orders from the human visitors.

Photo: Kelly April Tyrrell

There might be a day in the not-so-distant future when, instead of cat photos and selfies, we humans are showing off our robots.

Researchers at the Wisconsin Human-Computer Interaction Laboratory (HCI Lab) at the University of Wisconsin–Madison are actually looking forward to that day, when robots help us with workouts at home and allow soldiers overseas to check in on their sleeping children.

To help us get there, UW–Madison scientists are working to make human-robot interaction a lot more natural. Recently, the HCI Lab hosted an open house to show off some of the progress it has made.

A small girl in a purple coat sat in front of a long table. A shiny white and grey robot, about the size of a doll, sat in front of her. Between them was a display of objects: a red pail, a clear glass, a black ice cream scooper.

“Is it a square shape?” she asked the robot, its eyes lighting up blue.

“No,” it replied in a flat, child-like voice.

“Is it silver?” she asked Nao, the humanoid robot made by the Parisian company Aldebaran Robotics.

Nao repeated its response and the back-and-forth exchange continued until the girl guessed correctly which of the objects the robot had chosen. It congratulated her with the exuberance of an 8-year-old.

In the next room over, on the third floor of the Computer Sciences building, Casey Dezonia ordered “sushi” from a white and orange Nao stationed in front of a mock serving table.

Per Casey’s request, the robot located the sesame tofu – made from wooden blocks – and placed it on a plate in front of him.

“I want to know how the hands look so realistic,” Casey said while his father and Chien-Ming Huang, a computer sciences graduate student, looked on. “Can you build a robot with the same number of fingers as humans?”

Every now and then, the computer program running the robot would crash and the student scientists would hurry to fix it.

“There are so many things going on here, sometimes the computer inside will crash,” Huang explained.

Normally, a person might wear special glasses that help human and robot share gaze, but on this day the glasses weren’t working consistently. And though Nao can be programmed to respond to verbal commands, the constant hum of conversation in the room sometimes confused the robot.

A computer sciences undergraduate, Jing Jing, instead played the wizard behind the curtain and helped Nao along.

Huang explained the purpose of the HCI Lab’s work.

“If a robot can behave ‘human-like,’ then people don’t have to learn a new way to interact,” he said, using as an example the learning curve many experience when they buy a new smartphone.

Researchers in the HCI Lab are also trying to bridge the uncanny valley, a reference to the feelings of creepiness we experience when we interact with a robot that seems almost, but not quite, human.

The field of human-computer interaction is relatively new and blends together computer science, design, behavioral science and more. It involves more than just building a machine that works.

For example, telepresence robots allow a person to be virtually present in a space they don’t physically occupy. Operating the robot remotely from one location, a user can move vicariously through the robot’s physical space, taking in a virtual robot’s-eye view.

Irene Rae, a computer sciences graduate student under the HCI Lab’s director, Professor Bilge Mutlu, has found that factors like the height of the robot impact how people interact with each other using the technology. She is working to understand how to better design these telepresence robots to accommodate users’ preferences.

Sometimes referred to as “Skype on a stick,” the HCI Lab’s telepresence robot consists of an iPad-like device running a video-conferencing program, attached to a motorized base with wheels.

These robots allow doctors to treat patients in distant hospitals, business collaborators to duck in and out of offices and more.

At the open house, visitors could use the robot to “see” and navigate a maze from a computer in another room.
