A walk through the uncanny valley
Erika Biga Lee | 6/1/2006
Standing before a room of middle and high school students in Bloomington for the National Science Olympiad tournament two weeks ago, Karl MacDorman, a new associate professor of informatics at Indiana University-Purdue University Indianapolis, played a video of an android head. The disembodied head's eyes rotated and its mouth chewed as the watching students squirmed uncomfortably.

"What's your impression?" MacDorman asked the students.

"Eeeeew! Creeepy!" they said.

This eerie phenomenon, MacDorman said, is called "the uncanny valley."

As robots are made more humanlike, people at first respond more positively because the robots seem more familiar, he said, and their humanlike characteristics stand out. But because of our "strong expectations of what is human," MacDorman said, once a robot becomes "almost human," the response quickly turns negative, and we notice the non-human characteristics all the more clearly.
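
Mori's idea is usually drawn as a curve of emotional response against human-likeness that rises steadily, then plunges into a "valley" just before full human-likeness. Purely as an illustration (the formula and numbers below are invented for this sketch, not taken from Mori or MacDorman), such a curve might be modeled like this in Python:

```python
# Illustrative sketch: a made-up "affinity" curve with a dip near full
# human-likeness, in the spirit of Mori's uncanny-valley diagram.
import math

def affinity(humanlikeness: float) -> float:
    """Hypothetical affinity score for a robot; humanlikeness is in [0, 1]."""
    rising = humanlikeness  # more humanlike generally feels more familiar
    # A sharp dip centered just short of full human-likeness (the "valley").
    valley = -1.6 * math.exp(-((humanlikeness - 0.85) ** 2) / 0.005)
    return rising + valley

for h in [0.0, 0.25, 0.5, 0.75, 0.85, 0.95, 1.0]:
    bar = "#" * max(0, int((affinity(h) + 1) * 10))
    print(f"humanlikeness={h:.2f}  affinity={affinity(h):+.2f}  {bar}")
```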

"If a robot walks around with knees bent like this," MacDorman said while demonstrating the position, "you think, well this is just a robot. But if something that looks like a human being does it, you think that's odd, that's unnatural."

Many theories attempt to explain why the effect occurs. For example, MacDorman said, if you see a tree with blight, you don't say "eeeew," but if you see a human being with leprosy, it reminds you of disease and triggers a negative response. Some people, he added, believe very humanlike androids are unsettling because they resemble corpses.

"It could be that (very human-like androids) are a reminder of our mortality," MacDorman said. "(It can be) a reminder that we're all going to die, and it can be quite disturbing for that reason."

Robotics researcher Masahiro Mori first described the "uncanny valley" effect in 1970. Now that androids, as well as computer-generated characters in video games and movies, are becoming more realistic, the effect is not only a topic of discussion but also a matter of policy for 3-D animation studios like Pixar.

"Part of the success of 'The Incredibles' was that Pixar made a conscious choice to give the characters a cartoonish style," said Don Strawser, who teaches T160, the History and Social Impact of Video Games, at IU .

The "uncanny valley" effect is also a problem in video games, Strawser said, because game makers try to make human characters as realistic as possible without pushing it so far that the users will reject it.

"This is really hard because you would have to animate every human gesture and it's just not practical," he said. "Yet we don't want our cute and cuddly cartoon world to turn into a soulless, zombie land."

Companies like Pixar might have a policy about how to avoid uncanniness, but the effect has not been widely tested experimentally, MacDorman said. That's where his research comes in.

"Android science is the idea that very human-like androids can be used in experiments to find out what kinds of behavior are perceived as human," MacDorman said.

Evolution certainly plays a role in our ability to determine humanness, he said, and "we are very sensitive to what is human," particularly in facial and body proportions.

For example, a one-millimeter adjustment in eye width can mean the difference between seeing a person as ugly or beautiful, he said. Other studies show that men prefer women with a waist-to-hip ratio of about two to three -- and women with that ratio tend to be the most fertile.

However, we are probably not as sensitive to humanoid robots, he said. Though humanoid robots may have a head, arms, legs or even eyes and mouth, only androids have human details like skin, eyelashes and teeth.

"So, does appearance matter?" MacDorman asked as he showed the students a video of a robot moving with and without skin.

"People have more sympathy for an android than they do for a mechanical-looking robot ... It reminds you of yourself, your personal identity, but at the same time it's a machine," he said. "People will turn off characters they see on the T.V., but not want to turn off the lights in a room with an android."

He and his colleagues have also shown that realism depends on "non-conscious movements" -- the breathing patterns and minuscule adjustments people make without realizing it.

"Even when a person is doing absolutely nothing, they're never perfectly still," he said. "In fact, if you held yourself perfectly still you couldn't see -- everything would go optic-violet. People who are very good at meditating have discovered that. Our whole visual system depends on slight movements of the body."

To see if people could be fooled into believing an android was human, MacDorman's research team placed a seated android nine feet away, drew a curtain back for two seconds, then asked participants what they saw.

Only 23 percent of people were fooled by the android, and these tended to be older people, MacDorman said. But when the same experiment was run with the android making "non-conscious movements," 70 percent believed the android was human.

"In a two second period, most people were fooled ... but at least it shows the importance of movement in creating the impression of presence," MacDorman said.

The fact that older people have more difficulty telling android from human was a problem for his team when they presented a new android at the 2005 World Expo in Aichi, Japan.

"Some older people, maybe 70 or 80 years old, were asking where is it? Where's the android? Because they thought it was a person," he said.

And keeping those 70 percent of people fooled for longer than two seconds is extremely difficult, MacDorman said, partly because smooth movement is hard to design and build.

"The motion needs to look humanlike, but the joint is not the same and not human underneath," MacDorman said.

Whereas human arms have 244 degrees of freedom, current android prototypes have only 43. Air actuators in the joints create some of the smoothest motion, but they require a compressor and pump in another room to drive the android.

It's also a matter of cost. A few companies now make androids with realistic skin and motors with smoother movements and facial expressions for around $400,000, but initial prototypes cost around $1 million to make.

Another experimental use for androids is to study the nuances of human interaction. Making the robot appear more human not only gives it a physical presence, but also a social context, he said. In a recent experiment on gaze, MacDorman studied how people respond with their eyes in conversation when they believe the android is autonomous.

Whereas people in Europe and North America mostly look up and to the right when thinking about what to say to another person, people in Japan usually gaze downward.

"By world standards, the Japanese make a lot of eye contact, about 30 to 40 percent of the time, although this number is low compared to Europe and Asia," he said.

Subjects who believed the android was acting on its own averted their eyes less often and made more eye contact with it. Subjects who thought a human was controlling the android responded more naturally, looking downward while thinking of a response.
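
The article does not describe how gaze was actually coded in the study; as a purely illustrative sketch, eye contact and breaks of gaze could be tallied from per-frame labels like this:

```python
# Illustrative only: the label names and coding scheme here are assumptions,
# not MacDorman's protocol.
def gaze_summary(labels: list[str]) -> dict:
    """labels: one gaze label per video frame, e.g. 'contact', 'down', 'up-right'."""
    contact_frames = sum(1 for g in labels if g == "contact")
    breaks = sum(
        1 for prev, cur in zip(labels, labels[1:])
        if prev == "contact" and cur != "contact"  # a break = leaving eye contact
    )
    return {
        "eye_contact_pct": 100 * contact_frames / len(labels),
        "gaze_breaks": breaks,
    }

# Toy data: a subject who mostly holds eye contact, glancing down twice.
frames = ["contact"] * 6 + ["down"] * 2 + ["contact"] * 5 + ["down"] + ["contact"] * 4
print(gaze_summary(frames))
```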

"This is the first time breaking of gaze has been found to be related to what you believe is on the other end -- human or robot," MacDorman said.
