Can you be 100% sure that the person talking to you is not a robot? Soon you may not be so sure.
For the first time, scientists have built a robot that can move its mouth exactly like a human. This means it avoids the so-called “uncanny valley,” the effect where a robot’s actions appear unsettling because they’re uncomfortably close to natural but don’t quite reach it.
Researchers at Columbia University achieved this by letting their EMO robot study itself in a mirror. The robot learned how its flexible face and silicone lips would move in response to the precise actions of its 26 facial motors, each capable of up to 10 degrees of freedom.
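The article gives only the outline of this self-modeling step, but the core idea can be sketched in a few lines of Python. Everything below is illustrative: the motor count comes from the article, while the landmark dimensions, the linear mapping, and the noise level are hypothetical stand-ins for the neural networks and camera-based face tracking the real system presumably uses.

```python
# Illustrative sketch of mirror-based self-modeling: command random
# expressions, watch the result, and fit a motor -> landmark model.
# All dimensions except the 26 motors are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

N_MOTORS = 26          # motor count reported in the article
N_LANDMARKS = 2 * 52   # hypothetical: 52 tracked face points, (x, y) each

# Stand-in for the robot's physical face: an unknown motor -> landmark
# mapping that the mirror experiments are meant to uncover.
TRUE_MAP = rng.normal(size=(N_MOTORS, N_LANDMARKS))

def observe_in_mirror(motor_cmd: np.ndarray) -> np.ndarray:
    """Simulate what the camera sees for one motor command."""
    noise = rng.normal(scale=0.01, size=N_LANDMARKS)  # sensor noise
    return motor_cmd @ TRUE_MAP + noise

# "Thousands of seemingly random expressions": random motor babbling.
commands = rng.uniform(-1.0, 1.0, size=(5000, N_MOTORS))
landmarks = np.array([observe_in_mirror(c) for c in commands])

# Fit the forward self-model: predict landmark motion from motor commands.
self_model, *_ = np.linalg.lstsq(commands, landmarks, rcond=None)

# Sanity-check the model on fresh, unseen commands.
test_cmds = rng.uniform(-1.0, 1.0, size=(100, N_MOTORS))
err = np.abs(test_cmds @ self_model - test_cmds @ TRUE_MAP).mean()
print(f"mean landmark prediction error: {err:.4f}")
```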
They outlined their methods in a study published Jan. 14 in the journal Science Robotics.
How EMO learned to move its face like a human
EMO uses an artificial intelligence (AI) system called a vision-language-action (VLA) model, which means it can learn to translate what it sees into coordinated physical movements without predefined rules. During training, the humanoid robot made thousands of seemingly random expressions and lip movements while staring at its own reflection in the mirror.
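If a self-model like the one sketched earlier is available, translating what the robot sees into coordinated movements amounts to inverting it: given a facial shape observed through the camera, find the motor commands whose predicted effect comes closest. The sketch below shows that inversion with a hypothetical linear stand-in; the study’s actual VLA model is a learned neural network, not a matrix solve.

```python
# Illustrative vision-to-action step: invert a (hypothetical, linear)
# motor -> landmark self-model to mimic an observed expression.
import numpy as np

rng = np.random.default_rng(1)
N_MOTORS, N_LANDMARKS = 26, 104   # 26 motors per the article; rest assumed

F = rng.normal(size=(N_MOTORS, N_LANDMARKS))  # self-model stand-in

def landmarks_to_motors(target: np.ndarray) -> np.ndarray:
    """Find motor commands whose predicted landmark motion best
    matches an expression seen through the camera."""
    cmd, *_ = np.linalg.lstsq(F.T, target, rcond=None)
    return np.clip(cmd, -1.0, 1.0)            # respect actuator limits

# Example: mimic a target expression captured from video.
seen_expression = rng.normal(size=N_LANDMARKS)
motor_cmd = landmarks_to_motors(seen_expression)
print("motor command:", np.round(motor_cmd[:5], 3), "...")
```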
Then the researchers sat EMO in front of hours of YouTube videos of people speaking different languages and singing. This let the robot connect its knowledge of how its motors produce facial movements with the corresponding sounds, all without any understanding of what was being said. Eventually, EMO could take in spoken audio in 10 different languages and lip-sync to it almost perfectly.
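This second training stage can be sketched the same way: pair per-frame audio features with the lip shapes that co-occurred with them in video, and fit a predictor from sound to lip landmarks. No transcript or language understanding enters the loop, matching the article’s description. All sizes and the ridge-regression model below are assumptions; the real system trains a neural network on hours of footage.

```python
# Illustrative audio -> lip-landmark training, standing in for the
# YouTube stage described above. Feature sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
N_AUDIO_FEATS, N_LIP_LANDMARKS = 40, 40       # hypothetical dimensions

# Stand-in dataset: per-frame audio features and co-occurring lip shapes.
TRUE_AV = rng.normal(size=(N_AUDIO_FEATS, N_LIP_LANDMARKS))
audio = rng.normal(size=(20000, N_AUDIO_FEATS))
lips = audio @ TRUE_AV + rng.normal(scale=0.05,
                                    size=(20000, N_LIP_LANDMARKS))

# Ridge regression: audio features -> predicted lip landmark positions.
lam = 1e-2
W = np.linalg.solve(audio.T @ audio + lam * np.eye(N_AUDIO_FEATS),
                    audio.T @ lips)

# At runtime, each incoming audio frame yields target lip landmarks,
# which the self-model learned in the mirror stage turns into motors.
frame = rng.normal(size=N_AUDIO_FEATS)
target_lips = frame @ W
print("predicted lip targets:", np.round(target_lips[:4], 3), "...")
```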
“We had particular difficulty with hard sounds like ‘B’ and sounds involving pursed lips like ‘W,’” Hod Lipson, an engineering professor and director of Columbia’s Creative Machines Lab, said in a statement. “But these abilities will likely improve with time and practice.”
Many roboticists have tried and failed to create a convincing humanoid, so EMO needed to be tested in front of real people before being revealed to the world. The researchers showed 1,300 volunteers videos of the talking robot using the VLA model and two other approaches to controlling its mouth, along with a reference video demonstrating ideal lip movement.
The two other approaches were an amplitude baseline, in which EMO moved its lips based only on sound volume, and a nearest-neighbor landmark baseline, in which it imitated facial movements it had seen others make while producing similar sounds. The volunteers were instructed to select the clip that best matched the ideal lip movement; they chose the VLA model 62.46% of the time, compared with 23.15% and 14.38% for the amplitude and nearest-neighbor baselines, respectively.
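Those percentages are consistent with each of the 1,300 volunteers casting a single vote; the tallies below are reconstructed from the reported figures rather than taken from the study’s raw data.

```python
# Reconstruct the forced-choice tallies implied by the reported
# percentages (hypothetical counts: 812 + 301 + 187 = 1300 votes).
votes = {"VLA": 812, "amplitude": 301, "nearest-neighbor": 187}
total = sum(votes.values())

for method, n in votes.items():
    print(f"{method:>17}: {n / total:.2%}")  # 62.46%, 23.15%, 14.38%
```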
Robot nurses will require friendly faces
The researchers believe that overlooking the importance of the face is one reason other projects have failed to create convincing robots.
“A lot of humanoid robotics today is focused on leg and arm movement, for activities like walking and grasping,” Lipson said. “But facial expression is just as important for any robotics application involving human interaction.”
As AI technology continues to advance at a dizzying pace, robots are expected to take on a growing number of roles that require direct interaction with humans, including education, medicine and elder care. That means their effectiveness will depend in part on how well they can match human facial expressions.
“Robots with this ability will obviously have a much better ability to connect with humans, because such a significant part of our communication involves facial body language, and that whole channel is still untapped,” the study’s lead author, Yuhang Hu, said in a press release.
But his team is not the only one working to make humanoid robots seem more alive. In October 2025, a Chinese company released a video of an eerily realistic robot head, created as part of its effort to make interactions between humans and robots feel more natural. A year before that, a Japanese team revealed an artificial self-healing skin that could make robot faces look more human.
