Crossing the Uncanny Valley: Breakthrough in technology for lifelike facial expressions in androids

23.12.2024 Osaka University

A research group led by Osaka University has developed a technology that allows androids to dynamically express their mood states, such as “excited” or “sleepy,” by synthesizing facial movements as superimposed decaying waves.

Osaka, Japan – Even if an android's appearance is so realistic that it could be mistaken for a human in a photograph, watching it move in person can feel a bit unsettling. It can smile, frown, or display various other familiar expressions, but discerning a consistent emotional state behind those expressions can be difficult, leaving you unsure of what it is truly feeling and creating a sense of unease.

Until now, a “patchwork method” has been used to let robots that can move many parts of their face, such as androids, display facial expressions for extended periods. This method involves preparing multiple pre-arranged action scenarios to ensure that unnatural facial movements are excluded, then switching between these scenarios as needed.

However, this poses practical challenges, such as preparing complex action scenarios beforehand, minimizing noticeable unnatural movements during transitions, and fine-tuning movements to subtly control the expressions conveyed.

In this study, lead author Hisashi Ishihara and his research group developed a dynamic facial expression synthesis technology using “waveform movements,” which represent the various gestures that constitute facial movements, such as “breathing,” “blinking,” and “yawning,” as individual waves. These waves are propagated to the related facial areas and overlaid to generate complex facial movements in real time. This method eliminates the need to prepare complex and diverse action data in advance while also avoiding noticeable movement transitions.
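
As a rough illustration of this idea, the minimal Python sketch below, which is not the research group's implementation and uses purely hypothetical gesture names, actuator names, and parameter values, sums one decaying sinusoidal wave per triggered gesture to obtain each facial actuator's command at a given time:

import numpy as np

# Decaying sinusoid for one gesture wave: zero before its onset time t0,
# then an oscillation whose envelope decays at the given rate.
def decaying_wave(t, t0, amplitude, frequency, decay):
    dt = max(t - t0, 0.0)
    return amplitude * np.exp(-decay * dt) * np.sin(2.0 * np.pi * frequency * dt)

# Hypothetical per-gesture parameters: actuator -> (amplitude, frequency in Hz, decay rate in 1/s).
GESTURE_WAVES = {
    "breathing": {"chest": (0.10, 0.25, 0.0), "mouth": (0.05, 0.25, 0.0)},
    "blinking":  {"eyelid": (1.00, 2.00, 3.0)},
    "yawning":   {"mouth": (0.80, 0.20, 0.5), "eyelid": (0.30, 0.20, 0.5)},
}

def synthesize(t, onsets):
    # Superimpose (sum) the waves of all triggered gestures for each facial actuator.
    commands = {}
    for gesture, t0 in onsets.items():
        for actuator, (amp, freq, decay) in GESTURE_WAVES[gesture].items():
            commands[actuator] = commands.get(actuator, 0.0) + decaying_wave(t, t0, amp, freq, decay)
    return commands

# Example: 1.2 s after breathing started and 0.2 s after a blink was triggered.
print(synthesize(t=1.2, onsets={"breathing": 0.0, "blinking": 1.0}))

Because new gestures are simply added to the running superposition, movements blend continuously rather than jumping between pre-scripted scenarios.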

Furthermore, by introducing “waveform modulation,” which adjusts the individual waveforms based on the robot's internal state, changes in internal conditions, such as mood, can be instantly reflected as variations in facial movements.
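
A similarly hedged sketch of this modulation step, again with invented mood names and gain values rather than the study's actual parameters, simply rescales each gesture wave's amplitude, frequency, and decay according to the current mood before synthesis:

# Hypothetical mood-dependent gains: (amplitude gain, frequency gain, decay gain).
MOOD_GAINS = {
    "excited": (1.3, 1.5, 1.0),
    "neutral": (1.0, 1.0, 1.0),
    "sleepy":  (0.6, 0.5, 0.7),
}

def modulate(wave_params, mood):
    # Rescale each actuator's (amplitude, frequency, decay) according to the current mood.
    amp_gain, freq_gain, decay_gain = MOOD_GAINS[mood]
    return {
        actuator: (amp * amp_gain, freq * freq_gain, decay * decay_gain)
        for actuator, (amp, freq, decay) in wave_params.items()
    }

# Example: a sleepy mood makes a (hypothetical) blink wave smaller and slower.
blink_wave = {"eyelid": (1.00, 2.00, 3.0)}
print(modulate(blink_wave, "sleepy"))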

“Advancing this research in dynamic facial expression synthesis will enable robots capable of complex facial movements to exhibit more lively expressions and convey mood changes that respond to their surrounding circumstances, including interactions with humans,” says senior author Koichi Osuka. “This could greatly enrich emotional communication between humans and robots.”

Ishihara adds, “Rather than creating superficial movements, further development of a system in which internal emotions are reflected in every detail of an android's actions could lead to the creation of androids perceived as having a heart.”

By enabling robots to adaptively adjust and express their emotions, this technology is expected to significantly enhance the value of communication robots, allowing them to exchange information with humans in a more natural, humanlike manner.



Movie: Introduction video of the automatic generation of dynamic arousal expression on an android robot face
Original content; credit must be given to the creator: Hisashi Ishihara

###
The article, “Automatic generation of dynamic arousal expression based on decaying wave synthesis for robot faces,” was published in the Journal of Robotics and Mechatronics at DOI: https://doi.org/10.20965/jrm.2024.p1481

Title: Automatic generation of dynamic arousal expression based on decaying wave synthesis for robot faces
Journal: Journal of Robotics and Mechatronics
Authors: Hisashi Ishihara, Rina Hayashi, Francois Lavieille, Kaito Okamoto, Takahiro Okuyama, Koichi Osuka
DOI: 10.20965/jrm.2024.p1481
Funded by: Japan Society for the Promotion of Science
Attached documents
  • Fig. 1 Proposed system (original content; credit must be given to the creator: Hisashi Ishihara)
  • Fig. 2 Snapshots of realized sleepy mood expression on a child android robot (original content; credit must be given to the creator: Hisashi Ishihara)
Regions: Asia, Japan
Keywords: Applied science, Artificial Intelligence

