Reading signs: New method improves AI translation of sign language


Sign languages have been developed by nations around the world to fit local communication styles, and each language consists of thousands of signs. This has made sign languages difficult to learn and understand. Using artificial intelligence to automatically translate the signs into words, known as word-level sign language recognition, has now gained a boost in accuracy through the work of an Osaka Metropolitan University-led research group.

Previous research methods have focused on capturing information about the signer’s general movements. Accuracy problems have stemmed from the different meanings that can arise from subtle differences in hand shape and in the positional relationship between the hands and the body.

Graduate School of Informatics Associate Professor Katsufumi Inoue and Associate Professor Masakazu Iwamura worked with colleagues, including researchers at the Indian Institute of Technology Roorkee, to improve AI recognition accuracy. They added data on hand and facial expressions, as well as skeletal information on the position of the hands relative to the body, to the information on the general movements of the signer’s upper body.
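The published paper describes this as a multi-stream neural network that fuses the different cues. The sketch below is only an illustration of that general idea, not the authors’ implementation: the three streams, the GRU-based temporal encoders, and all layer sizes and feature dimensions are assumptions made for the example.

# Illustrative sketch of a multi-stream classifier (not the authors' code).
# Three feature streams -- upper-body motion, hand/face local regions, and
# skeletal keypoints -- are each encoded over time and fused for word-level
# classification. All dimensions and layer choices are assumptions.
import torch
import torch.nn as nn

class MultiStreamSignClassifier(nn.Module):
    def __init__(self, body_dim=256, local_dim=256, skel_dim=75,
                 hidden_dim=128, num_words=2000):
        super().__init__()
        # One temporal encoder per stream (GRU over per-frame features).
        self.body_enc = nn.GRU(body_dim, hidden_dim, batch_first=True)
        self.local_enc = nn.GRU(local_dim, hidden_dim, batch_first=True)
        self.skel_enc = nn.GRU(skel_dim, hidden_dim, batch_first=True)
        # Concatenate the three stream summaries and predict a word label.
        self.classifier = nn.Linear(3 * hidden_dim, num_words)

    def forward(self, body_seq, local_seq, skel_seq):
        # Each input: (batch, frames, features); keep each final hidden state.
        _, h_body = self.body_enc(body_seq)
        _, h_local = self.local_enc(local_seq)
        _, h_skel = self.skel_enc(skel_seq)
        fused = torch.cat([h_body[-1], h_local[-1], h_skel[-1]], dim=1)
        return self.classifier(fused)

# Example usage with random stand-in features for a batch of 4 clips.
model = MultiStreamSignClassifier()
body = torch.randn(4, 60, 256)     # upper-body motion features per frame
local = torch.randn(4, 60, 256)    # hand and face region features per frame
skel = torch.randn(4, 60, 75)      # e.g. 25 keypoints x (x, y, confidence)
logits = model(body, local, skel)  # shape: (4, num_words)

The point of fusing the hand, face, and skeleton streams alongside the whole-body stream is to give the classifier access to exactly the fine-grained cues that distinguish signs with otherwise similar overall movements.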

“We were able to improve the accuracy of word-level sign language recognition by 10-15% compared to conventional methods,” Professor Inoue explained. “In addition, we expect that the method we have proposed can be applied to any sign language, hopefully leading to improved communication with speaking- and hearing-impaired people in various countries.”

The findings were published in IEEE Access.

Funding
This work was supported by JSPS KAKENHI JP19K12023.

###

About OMU
Established in Osaka as one of the largest public universities in Japan, Osaka Metropolitan University is committed to shaping the future of society through “Convergence of Knowledge” and the promotion of world-class research. For more research news, visit https://www.omu.ac.jp/en/ and follow us on social media: X, Facebook, Instagram, LinkedIn.
Journal: IEEE Access
Title: Word-Level Sign Language Recognition With Multi-Stream Neural Networks Focusing on Local Regions and Skeletal Information
DOI: 10.1109/ACCESS.2024.3494878
Author(s): Mizuki Maruyama, Shrey Singh, Katsufumi Inoue, Partha Pratim Roy, Masakazu Iwamura, and Michifumi Yoshioka
Publication date: 11 November 2024
URL: https://doi.org/10.1109/ACCESS.2024.3494878
Attachments
  • Improving AI accuracy: Adding data such as hand and facial expressions, as well as skeletal information on the position of the hands relative to the body, to the information on the general movements of the signer’s upper body improves word recognition. Credit: Osaka Metropolitan University
Regions: Asia, Japan
Keywords: Applied science, Artificial Intelligence, Computing, People in technology & industry
