Facial expressions are not the only way we express our emotions: body language also plays a crucial role. The way our body moves when laughing at a comedy show differs from how we move when reuniting with an old friend. Although both are expressions of happiness, finding common patterns across such different situations is difficult, and these patterns become even harder to define once cross-cultural differences are considered.
Researchers from the Research Institute of Electrical Communication at Tohoku University (Japan), in collaboration with National Chung Cheng University (Taiwan), tackled these challenges by collecting body movements across different situations, emotions, and personal styles in a motion capture lab. Unlike previous databases, which focused on simple, repetitive actions (e.g., walking sadly, walking happily, walking angrily) and were primarily based on Western populations, this new database aims to capture a wide range of emotional bodily expressions across various scenarios, with a focus on Japanese people born and raised in Japan.
To build this database, six professional performers were asked to recall personal emotional experiences (e.g., situations that made them happy, angry, sad, surprised, fearful, disgusted, or contemptuous) and act them out using full-body movement. Their performances were recorded with high-speed motion capture technology, which captured the three-dimensional coordinates of their movements. The recordings were later rendered as simple stick-figure animations.
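Rendering motion capture as a stick figure essentially means projecting each recorded 3-D joint position onto a plane and drawing line segments between connected joints. The sketch below illustrates this idea; the joint names and skeleton topology are simplified illustrations, not the actual marker set used in the study.

```python
# Hypothetical sketch: reducing one frame of 3-D motion-capture data to the
# 2-D line segments of a stick figure. The skeleton below is illustrative,
# not the study's actual rig.

# Skeleton as (parent, child) joint pairs -- each pair becomes one "stick".
SKELETON = [
    ("hip", "spine"), ("spine", "head"),
    ("spine", "l_hand"), ("spine", "r_hand"),
    ("hip", "l_foot"), ("hip", "r_foot"),
]

def stick_figure_frame(joints_3d):
    """Orthographically project 3-D joint positions (x, y, z) onto the
    x-y plane and return the 2-D line segments of one animation frame."""
    segments = []
    for parent, child in SKELETON:
        (x1, y1, _), (x2, y2, _) = joints_3d[parent], joints_3d[child]
        segments.append(((x1, y1), (x2, y2)))
    return segments

# One captured frame: joint name -> (x, y, z) coordinates in metres.
frame = {
    "hip": (0.0, 1.0, 0.0), "spine": (0.0, 1.5, 0.0), "head": (0.0, 1.8, 0.0),
    "l_hand": (-0.4, 1.4, 0.1), "r_hand": (0.4, 1.4, 0.1),
    "l_foot": (-0.2, 0.0, 0.0), "r_foot": (0.2, 0.0, 0.0),
}
print(len(stick_figure_frame(frame)))  # → 6, one segment per skeleton bone
```

Repeating this projection for every captured frame and animating the segments yields the stick-figure clips shown to participants, which strip away appearance cues while preserving the movement itself.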
Study participants watched these animations and judged the emotion each one expressed. The results showed that they could recognize the emotions from body movements alone, with accuracy comparable to that reported for earlier simple-action databases. Among the emotions studied, anger and fear were the easiest to recognize, while contempt and disgust were the hardest.
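Recognition accuracy in this kind of judgement task is typically scored per emotion category: for each intended emotion, the fraction of trials on which participants chose that emotion. A minimal sketch, using made-up trial data rather than the study's results:

```python
# Hypothetical sketch: per-category recognition accuracy for an
# emotion-judgement task. The trial data are illustrative only.
from collections import defaultdict

def accuracy_by_emotion(trials):
    """trials: iterable of (intended_emotion, judged_emotion) pairs.
    Returns the fraction of correct judgements per intended emotion."""
    correct, total = defaultdict(int), defaultdict(int)
    for intended, judged in trials:
        total[intended] += 1
        if judged == intended:
            correct[intended] += 1
    return {emotion: correct[emotion] / total[emotion] for emotion in total}

# Made-up trials: anger is recognized reliably, contempt is often
# confused with disgust (mirroring the pattern the study describes).
trials = [
    ("anger", "anger"), ("anger", "anger"),
    ("fear", "fear"), ("fear", "anger"),
    ("contempt", "disgust"), ("contempt", "contempt"),
]
print(accuracy_by_emotion(trials))
```

Scoring per category, rather than overall, is what makes it possible to say that some emotions (anger, fear) are easier to read from body movement than others (contempt, disgust).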
"This is a crucial first step in developing a diverse, Asian-based bodily movement database to support research on emotional communication," remarks Dr. Miao Cheng, an Assistant Professor at Tohoku University.
"We are extremely fortunate to have the support of a local musical company based in Sendai, SCS musical company, to fully support this endeavor from the moment performers were recruited and throughout their performances. Without the professional help provided by Mr. Jun Hirose and Mr. Kazumasa Fujita from SCS musical company, it would have been very difficult to properly bridge art and science in this study," said Dr. Chia-huei Tseng, an Associate Professor at Tohoku University.
Future research can use this database to offer new insights into cultural variations in nonverbal communication, which will help in developing communication technologies that facilitate cross-cultural understanding. The resource is expected to benefit both researchers and industries that utilize motion capture, such as gaming, film, animation, and virtual reality.
The findings were published in Behavior Research Methods, the official journal of The Psychonomic Society, on December 10, 2024.
Professor Yoshifumi Kitamura, the leader of the research team and the director of the Interdisciplinary ICT Research Center for Cyber and Real Spaces, received research funding from the New Energy and Industrial Technology Development Organization (NEDO) in Japan.