When you smile at it, “EVA” returns a friendly expression: researchers have developed a robot face that uses artificial intelligence and sophisticated fine motor control to capture and convincingly imitate the facial expressions of people nearby. The concept is a further step toward making “technical beings” appear more human, so that interacting with them becomes more pleasant.
Joy, friendliness, curiosity …: many emotions and messages are reflected in the complex facial expressions of humans – facial expression is an important part of our non-verbal communication. When it is missing, faces strike us as off-putting and untrustworthy. So far, this has largely been true of robot faces: their mostly static features hardly make them seem personable. Given that these machines are increasingly meant to serve as points of contact for people in many areas, there is room for improvement, say the scientists of the Creative Machines Lab at Columbia University in New York. They have now presented their progress in this area at the IEEE 2021 ICRA International Conference on Robotics and Automation in Xi’an, China.
“The idea for EVA came up a few years ago, when my students and I noticed that we had the uncomfortable feeling that the robots in the laboratory were staring at us,” reports the head of the research group, Hod Lipson. He also noticed the desire for more human features in robotic systems while observing grocery stores, where staff had given name tags to shelf-stocking robots or put caps on them. “People seemed to want to humanize their technical colleagues in this way. That gave us the idea of building a robot with a super-expressive and responsive human face,” says Lipson.
Fine motor skills and artificial intelligence
As he and his colleagues report, however, creating a convincing robot face that comes close to the human model turned out to be an extremely tricky task. The first hurdle was to develop physical machinery with extremely capable fine motor control, because our facial expressions arise from a complex interplay of more than 42 tiny muscles that attach to the skin and bones of the human face at various points. “It was a particular challenge to design a system that fits into a human-sized head and is at the same time powerful enough to generate a wide range of facial expressions,” says co-author Zanwar Faraj.
Using 3D-printed components and a sophisticated system of pull cables and tiny motors, the developers finally succeeded in giving the artificial face, which is covered by a plastic membrane, expressions that increasingly resembled human facial expressions. “One day I caught myself smiling back reflexively when EVA responded to me with a matching expression,” says Lipson. The robot’s face can now convey the six basic emotions – anger, disgust, fear, joy, sadness, and surprise – as well as a number of more nuanced emotions, surprisingly well, the researchers report.
Once they were satisfied with EVA’s mechanics, they turned to their second goal: programming the artificial intelligence that controls EVA’s facial movements. The robot face should be able to read the expressions of nearby human faces and then mirror them. EVA’s “brain” was equipped with deep neural networks. As the researchers explain, this system had to provide two capabilities: first, EVA had to learn to use its own complex system of mechanical muscles to produce a particular facial expression; second, it needed the ability to automatically capture human facial expressions in order to mirror them.
EVA learns to mirror facial expressions
To teach the system how its own face looks and behaves, the scientists filmed EVA for hours as it generated a series of random facial expressions. The system’s internal neural network then learned, much as a person watching themselves in a mirror, to associate muscle movements with the image data of its own face. In this way EVA got a feel for how its own face works. It could then learn, through the artificial intelligence, to compare this self-image with recordings of human faces captured by a video camera. EVA thus finally developed the ability to read human facial expressions and react to them by imitating the expression of the person in front of it.
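The two stages described above – random “babbling” expressions to learn a self-model, then inverting that model to mirror an observed face – can be sketched in miniature. The snippet below is a hypothetical illustration, not the authors’ code: the motor count, landmark dimensionality, and the linear “face mechanics” are all invented stand-ins (the real system uses deep networks and camera images), but the training logic follows the same pattern.

```python
import numpy as np

# Hypothetical sketch of EVA's self-modeling stage (not the authors' code):
# the robot issues random motor commands, observes the resulting face,
# and fits an inverse model from face appearance back to motor commands.

rng = np.random.default_rng(0)

N_MOTORS = 12    # assumed number of cable motors
N_FEATURES = 30  # assumed dimensionality of extracted face landmarks

# Unknown "face mechanics": how motor commands move the facial landmarks.
# A fixed random linear map stands in for the real physical face here.
true_mechanics = rng.normal(size=(N_FEATURES, N_MOTORS))

def observe_face(commands):
    """Simulated camera: landmarks produced by a motor command vector."""
    return true_mechanics @ commands

# --- "Babbling" phase: random expressions, recorded by the camera ---
commands = rng.uniform(-1, 1, size=(500, N_MOTORS))
faces = np.array([observe_face(c) for c in commands])

# --- Fit the inverse model: landmarks -> motor commands (least squares) ---
inverse_model, *_ = np.linalg.lstsq(faces, commands, rcond=None)

def mirror(human_landmarks):
    """Predict motor commands that reproduce an observed expression."""
    return human_landmarks @ inverse_model

# A "human" expression given as landmarks; EVA infers matching commands.
target_cmd = rng.uniform(-1, 1, size=N_MOTORS)
human_face = observe_face(target_cmd)
recovered = mirror(human_face)
print(np.allclose(recovered, target_cmd, atol=1e-6))  # → True
```

Because the toy face mechanics are linear, least squares recovers the commands exactly; the real system instead trains one network to predict its own appearance from motor commands and a second to map observed human faces back onto those commands.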
To some this will seem even eerier than a rigid facial expression – but it may just be a matter of getting used to it. “Our brains seem to respond well to robots that have some sort of recognizable physical presence,” says Lipson. The researchers also point out that EVA is so far an experimental system, and that its performance is still far from the complex way people use facial expressions to communicate with one another. However, they are convinced that such technologies may one day prove useful in fostering a pleasant sense of interaction with robotic systems. “Robots are woven into our lives in more and more ways, so it is becoming increasingly important to build trust between humans and machines,” says co-author Boyuan Chen.
Source: Columbia’s School of Engineering and Applied Science, presentation at the IEEE 2021 ICRA International Conference on Robotics and Automation