Video: Facial robot “Emo” shows its ability to synchronize facial expressions with a human partner. © Yuhang Hu
Convincing robot facial expressions through fine motor skills and artificial intelligence: Researchers have taught their facial robot “Emo” to predict a human smile from subtle cues and then produce the same expression itself, ideally simultaneously. As they explain, high synchrony in facial expressions plays an important role in people’s sense of connection. The concept could therefore contribute significantly to making such technical “beings” more likeable, say the developers.
They can move deftly, look human, and are equipped with “minds” and language skills: in recent years, increasingly capable robots with humanoid features have emerged. But in one area development lags behind: the robots often still seem strangely mechanical. An important reason is their missing or inadequate facial expressions, which play an important role in establishing emotional connections between humans. Robots have already been developed that can mirror the emotional facial expressions of their counterparts. But due to unnatural delays, this ability often appears more eerie than endearing.
As studies of human nonverbal communication suggest, facial synchrony between two partners plays an important role in the perception of connectedness: “Social alignment behaviors, such as simultaneous smiling, are important for successful social interactions because they indicate mutual understanding and shared emotions,” write the researchers led by Yuhang Hu from Columbia University in New York. Building on their previous developments, they now present a prototype of a facial robot that can establish social synchrony with a human partner.
A learning mind
“Emo” is a human-like head with flexible silicone skin. The skin is attached via magnetic points to an underlying system of 26 actuators. These produce fine motor movements that, transferred to the silicone skin, can generate a wide range of nuanced facial expressions. High-resolution cameras in Emo’s “eyes” allow it to make precise eye contact with a human counterpart. They also record the signals of the partner’s non-verbal communication.
The facial robot’s “mind” is formed by built-in computer units equipped with artificial intelligence – neural networks capable of learning on their own. First, the developers taught the robot to produce specific facial expressions. They placed it in front of a camera and had it make random movements. The recordings were fed back to the AI as feedback. After a few hours, the robot had learned the connection between its facial expressions and the corresponding motor commands – much as people practice facial expressions in front of a mirror. The team calls this process self-modeling.
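The self-modeling loop described above can be sketched in a few lines. This is a deliberately simplified toy, not the team’s actual system: a random linear map stands in for the real physics of the 26 actuators and the silicone skin, the “camera” observation is a landmark vector, and the inverse model is fitted by least squares rather than by the neural network the researchers use. All names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the robot's face: 26 actuator commands
# produce a vector of observed facial landmarks. A fixed random linear
# map plays the role of the real actuator/skin physics.
N_ACTUATORS = 26
N_LANDMARKS = 40
true_map = rng.normal(size=(N_ACTUATORS, N_LANDMARKS))

def observe_expression(commands):
    """Simulated camera observation of the face a motor command produces."""
    return commands @ true_map

# "Motor babbling": issue random commands and record the resulting
# expressions, as in the self-modeling phase described above.
commands = rng.normal(size=(2000, N_ACTUATORS))
expressions = observe_expression(commands)

# Learn the inverse model expression -> commands (here by least squares;
# the actual system trains a neural network on the camera footage).
inverse_model, *_ = np.linalg.lstsq(expressions, commands, rcond=None)

# Given a target expression, the robot can now recover the motor
# commands that produce it.
target_cmd = rng.normal(size=N_ACTUATORS)
target_expr = observe_expression(target_cmd)
recovered_cmd = target_expr @ inverse_model
print(np.allclose(recovered_cmd, target_cmd, atol=1e-6))  # prints True
```

The key idea survives the simplification: random exploration plus observation of its own face gives the robot an invertible mapping from desired expressions back to motor commands.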
Successfully trained to recognize subtle signs
The robot was then shown videos of human facial movements during social interaction. From these, it was able to learn which facial details precede particular expressions. As it turned out, after several hours of training it could recognize, a fraction of a second in advance, that the other person was about to smile. Emo was thus able to engage its facial fine motor skills early enough to produce a synchronized smile. “The robot is capable of sharing non-verbal signals with a human in real time. This not only improves the quality of interaction, but also promotes the building of trust between humans and machines,” says Hu.
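The anticipation step can be illustrated with a minimal sketch. Assuming (hypothetically) that a single scalar “mouth-corner” cue is extracted per video frame and that a subtle rise in that cue precedes the full smile, a simple threshold detector already fires several frames early, which is what lets the robot start moving its face in time to synchronize. The real system learns this from video with a neural network; everything below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_sequence(smile_frame, n_frames=60):
    """Synthetic per-frame mouth-corner cue: noise, then a subtle
    ramp starting ~10 frames before the full smile."""
    signal = rng.normal(0.0, 0.02, size=n_frames)
    ramp_start = smile_frame - 10
    for t in range(ramp_start, n_frames):
        signal[t] += 0.05 * (t - ramp_start)
    return signal

def predict_smile_onset(signal, threshold=0.1):
    """Return the first frame whose cue exceeds the threshold,
    or None if it never does."""
    above = np.flatnonzero(signal > threshold)
    return int(above[0]) if above.size else None

seq = make_sequence(smile_frame=40)
predicted = predict_smile_onset(seq)
# The detector fires several frames (i.e. a fraction of a second at
# typical video frame rates) before the full smile at frame 40.
print(predicted is not None and predicted < 40)  # prints True
```

The lead time between detection and the full smile is the window in which the robot’s actuators can be driven so that both expressions peak together.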
The researchers now plan to develop and expand the concept further. For example, they intend to equip Emo with verbal communication skills as well, by integrating systems such as ChatGPT. The possible applications for such highly developed humanoid robots are diverse, the developers emphasize. “By creating robots that can accurately interpret and mimic human expressions, we move closer to a future in which they can seamlessly integrate into our daily lives. This could allow robots to provide us with companionship, support and even empathy,” says senior author Hod Lipson from Columbia University.
However, some may also think of the risks that this mechanization of people’s social environment could entail. This aspect evidently matters to Lipson as well: “Although increasingly human-like systems enable a wealth of positive applications, developers and users also have a duty to consider the ethical aspects,” says the scientist.
Source: Columbia University School of Engineering and Applied Science; scientific article: Science Robotics, doi: 10.5061/dryad.gxd2547t7