Motion capture cameras to help androids make realistic facial expressions

By Brooks Hays   |   March 22, 2021 at 9:03 AM
Researchers used metallic dots and high-speed cameras to observe the differences in the movements and patterns of human and android facial expressions, finding that the flow lines of moving human faces are much more curved than those of androids. Photo by Ishihara, et al./Frontiers in Robotics and AI

March 22 (UPI) -- Making androids more lifelike requires more than just a bigger vocabulary and more demonstrative hand gestures -- facial expressions are an essential part of human communication.

To better gauge the expressiveness of android faces, scientists used motion capture cameras to compare the facial movements of humans and robots.

Scientists hope their analysis -- published Monday in the journal Frontiers in Robotics and AI -- will help robotic engineers design more expressive androids.

Facial expressions are powerful communication tools -- so powerful, in fact, that previous studies have shown dogs, horses and even bears can recognize human emotions from them.

So far, androids have struggled to replicate human facial expressions. Some researchers suspect awkward or poorly timed expressions might explain why some people find lifelike robots creepy.

For the new study, scientists put reflective dots on the faces of five androids and several human volunteers. Infrared cameras captured the movement of the dots as the androids and humans produced a variety of facial expressions.

The sensing technology captured 120 frames per second and rendered the movements of the face as three-dimensional displacement vectors.
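For readers curious what that processing step might look like, here is a minimal sketch in Python. It assumes hypothetical marker data (the array shapes, marker count, and neutral-frame reference are illustrative assumptions, not details from the study) and shows how per-marker 3D displacement vectors can be derived from tracked positions.

```python
import numpy as np

# Hypothetical illustration: markers tracked at 120 frames per second,
# each frame holding (x, y, z) coordinates for every reflective dot.
FPS = 120
NUM_MARKERS = 50          # assumed marker count, not from the study
NUM_FRAMES = FPS * 2      # two seconds of capture

# Simulated marker trajectories, shape (frames, markers, 3).
# In a real setup these would come from the infrared camera system.
rng = np.random.default_rng(0)
trajectories = rng.normal(size=(NUM_FRAMES, NUM_MARKERS, 3))

# Treat the first frame (a neutral expression) as the reference, then
# express each later frame as a 3D displacement vector per marker.
reference = trajectories[0]
displacements = trajectories - reference   # shape (frames, markers, 3)

# Magnitude of movement per marker per frame, useful for comparing
# how far different facial regions travel during an expression.
magnitudes = np.linalg.norm(displacements, axis=2)
print(magnitudes.shape)   # (240, 50)
```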

"Advanced artificial systems can be difficult to design because the numerous components have complex interactions with each other," lead study author Hisashi Ishihara said in a press release.

"The appearance of an android face can experience surface deformations that are hard to control," said Ishihara, a researcher at Osaka University in Japan.

Deformations can be caused both by the awkward movements of mechanical actuators and by interactions between the outer layers of faux skin and the skull-shaped structure beneath.

The data showed that the patterns and movements of flow lines on the faces of androids and humans were very different, especially near the eyes and on the forehead.

As the cameras revealed, the flow lines of androids were quite straight, while the flow lines of expressive human faces were much more curved.
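One simple way to quantify that straight-versus-curved distinction is to compare a flow line's end-to-end distance against its total path length. The sketch below is an illustrative metric under that assumption, not the measure used in the study; the function name and sample paths are hypothetical.

```python
import numpy as np

def path_straightness(points: np.ndarray) -> float:
    """Ratio of end-to-end distance to total path length (1.0 = straight).

    `points` is an (N, 3) array of marker positions along one flow line.
    """
    segment_lengths = np.linalg.norm(np.diff(points, axis=0), axis=1)
    path_length = segment_lengths.sum()
    chord = np.linalg.norm(points[-1] - points[0])
    return float(chord / path_length) if path_length > 0 else 1.0

# A straight flow line scores near 1.0; a curved one scores lower.
straight = np.column_stack([np.linspace(0, 1, 10), np.zeros(10), np.zeros(10)])
t = np.linspace(0, np.pi, 10)
curved = np.column_stack([np.cos(t), np.sin(t), np.zeros(10)])
print(path_straightness(straight))  # ~1.0
print(path_straightness(curved))    # noticeably less than 1.0
```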

Scientists also noted major differences in surface undulation patterns on the upper portions of human and android faces.

"Redesigning the face of androids so that the skin flow pattern resembles that of humans may reduce the discomfort induced by the androids and improve their emotional communication performance," said senior study author Minoru Asada.

"Future work may help give the android faces the same level of expressiveness as humans have. Each robot may even have its own individual 'personality' that will help people feel more comfortable," said Asada, a professor of adaptive machine systems at Osaka.