Androids have mastered the art of being fascinating and nightmarish all at once. Seeing a human-like android move in person often stirs up an eerie discomfort in our chests.
Sure, it’s a marvel of technology, designed to mimic human expressions such as smiling and frowning, but there isn’t any genuine feeling behind those gestures, which creates a sense of unease.
As humans, we are experts at picking up subtle cues in facial expressions and body language—it’s just the way our brains are wired.
So, when a robot smiles without any real emotion or seems just a bit too stiff and mechanical, our minds struggle to process this distorted version of ourselves.
The mismatch between an android’s lifelike appearance and its lack of emotional depth produces the uncanny valley effect.
It’s so unsettling because it poses as something that’s almost human yet behaves in a way that makes its artificiality stand out.
Until now, a “patchwork method” has been used to let robots move many parts of their faces and hold facial expressions for extended periods.
The method involves preparing several action scenarios to avoid unnatural facial movements, with the flexibility to switch between these scenarios as needed.
However, this comes with its own set of challenges: complex action scenarios have to be prepared in advance, noticeable unnatural movements during transitions have to be minimized, and the movements have to be fine-tuned to control the resulting expressions.
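To make those challenges concrete, here is a minimal sketch of what scenario-based facial control generally looks like. The scenario names, keyframes, and blend logic below are illustrative assumptions, not details taken from the study or from any particular robot.

```python
# Illustrative sketch only: the "patchwork method" relies on pre-authored
# action scenarios and switches between them at runtime. All names,
# keyframes, and the blend logic here are made up for illustration.

SCENARIOS = {
    # Each scenario is a list of (time_s, actuator_targets) keyframes
    # that an animator has to prepare in advance.
    "idle":  [(0.0, {"eyelids": 0.10, "mouth": 0.00}),
              (2.0, {"eyelids": 0.15, "mouth": 0.05})],
    "smile": [(0.0, {"eyelids": 0.20, "mouth": 0.60}),
              (1.5, {"eyelids": 0.25, "mouth": 0.70})],
}

def switch_scenario(current_pose, target_keyframes, blend_s=0.5):
    """Blend from the current pose toward a new scenario's first keyframe.

    The blend window is where the method's main difficulty lives: too
    short and the jump looks mechanical, too long and the face lags
    behind the interaction.
    """
    _, first_pose = target_keyframes[0]
    steps = int(blend_s * 20)  # assume a 20 Hz control loop
    for i in range(1, steps + 1):
        alpha = i / steps
        yield {k: (1 - alpha) * current_pose.get(k, 0.0) + alpha * v
               for k, v in first_pose.items()}

# Example: blend from an idle pose toward the "smile" scenario.
for pose in switch_scenario({"eyelids": 0.1, "mouth": 0.0}, SCENARIOS["smile"]):
    pass  # in a real system, each pose would be sent to the actuators
```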
In a new study, a team of researchers from Osaka University developed a technology using “waveform movements,” which represent gestures like breathing, blinking, and yawning.
The technique produces complex facial movements in real time and should make robots appear more lifelike. It eliminates the need to prepare large libraries of action data in advance while also avoiding obvious movement transitions, and it allows changes in a robot’s internal state to be reflected in its facial expressions instantly.
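The study doesn’t publish its implementation here, but the core idea, superimposing simple waveforms and modulating them by an internal state, can be sketched. Every wave shape, actuator name, gain value, and the arousal parameter below is an assumption made for illustration, not the authors’ code.

```python
import math

# Illustrative sketch only: periodic waves for breathing and blinking are
# superimposed, per actuator, to drive the face in real time. Modulating
# the gains by an internal-state parameter shifts the whole face's mood
# instantly, with no scripted transitions to manage.

def breathing_wave(t, rate_hz=0.25):
    """Slow sinusoid approximating the facial motion of breathing."""
    return 0.5 * (1 + math.sin(2 * math.pi * rate_hz * t))

def blink_wave(t, period_s=4.0, duration_s=0.15):
    """Brief periodic pulse approximating an eyelid blink."""
    return 1.0 if (t % period_s) < duration_s else 0.0

def actuator_commands(t, arousal=0.5):
    # Each actuator receives a weighted sum of the waveforms; the gains
    # stand in for how strongly each wave "propagates" to that part of
    # the face. The arousal parameter is a hypothetical internal state.
    breath = breathing_wave(t, rate_hz=0.2 + 0.2 * arousal)
    blink = blink_wave(t)
    return {
        "eyelids": 0.2 * breath + 0.8 * blink,
        "jaw": 0.3 * breath * (1 - arousal),
        "brows": 0.15 * breath * arousal,
    }

# Example: sample commands at 20 Hz for one second.
for step in range(20):
    print(actuator_commands(step / 20.0, arousal=0.7))
```

The design point is that nothing here is pre-scripted: because every expression is generated by the same continuously running waves, a change in the internal state takes effect on the very next control tick.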
“Advancing this research in dynamic facial expression synthesis will enable robots capable of complex facial movements to exhibit more lively expressions and convey mood changes that respond to their surrounding circumstances, including interactions with humans,” said Koichi Osuka, the senior author of the study.
“This could greatly enrich emotional communication between humans and robots.”
“Rather than creating superficial movements, further development of a system in which internal emotions are reflected in every detail of an android’s actions could lead to the creation of androids perceived as having a heart,” added Hisashi Ishihara, the lead author of the study.
Overall, the technology’s ability to adjust and express emotions on the fly should significantly boost the value of communication robots, which are expected to exchange information with humans more naturally.
The details of the full study were published in the Journal of Robotics and Mechatronics.