2020-480 A Fully Integrated Stretchable Sensor Array for Wearable Sign Language Translation to Voice

SUMMARY:

UCLA researchers in the Department of Bioengineering have developed a novel machine-learning-assisted wearable sensor system for direct, high-accuracy translation of sign language into voice.

BACKGROUND:

For the large community of deaf signers around the world who rely on sign language for conversations, communication with people who are unfamiliar with sign language is a challenge. A number of sign language translation devices have been developed using wearable sensing approaches, including surface electromyography (sEMG) and sensors based on the piezoresistive effect, ionic conduction, or the capacitive effect, as well as camera-based image processing. The production and use of these translators are limited by issues such as the required positioning of the worn sensors for sEMG-based translation and the lighting conditions for vision-based translation. Moreover, most translation systems convert sign language into text rather than speech, making them inconvenient for practical communication. There is a need for a stable, accurate, and portable sign language translation system that can directly convert sign language into voice for better communication between signers and non-signers.

INNOVATION:

UCLA researchers in the Department of Bioengineering have developed an integrated stretchable sensor array (ISSA) system for real-time translation of sign language into voice. Sensors are integrated into the fingers of a glove, and the analog signals generated by each finger are digitized and then translated into voice. The ISSA system has been successfully prototyped and used to recognize 660 hand gesture patterns based on American Sign Language, with a recognition rate of 98.63% and a translation time of less than one second.
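The translation pipeline described above (finger sensor signals digitized, classified by a machine learning model, and converted to voice) can be illustrated with a minimal sketch. This is not the authors' implementation: the data shapes, sign labels, and classifier choice are illustrative assumptions, and a generic k-nearest-neighbors model stands in for the actual recognition algorithm.

```python
# Minimal sketch of glove-signal-to-word classification (illustrative only).
# All shapes, labels, and values are assumed; they do not reflect the ISSA system's data.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training data: each sample is one time window of digitized
# readings from 5 finger sensors, labeled with the signed word it represents.
rng = np.random.default_rng(0)
signs = ["hello", "thanks", "yes", "no"]
X_train = rng.normal(size=(200, 5))      # 200 windows x 5 finger channels
y_train = rng.choice(signs, size=200)    # placeholder labels

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)

def translate_window(sensor_window: np.ndarray) -> str:
    """Map one 5-channel sensor reading window to a predicted sign label."""
    return clf.predict(sensor_window.reshape(1, -1))[0]

# Simulated incoming reading; in a real system this would come from the
# glove's analog front end after digitization.
reading = rng.normal(size=5)
word = translate_window(reading)
print(word)  # the recognized label would then be sent to a text-to-speech
             # engine to produce the voice output
```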

POTENTIAL APPLICATIONS:

  • Sign language translation
  • General body and gesture recognition
  • Robotics monitoring and development
  • Remote control

ADVANTAGES:

  • Mechanical and chemical durability
  • Flexible material
  • High sensitivity (2.07 V)
  • Quick response time (<15 ms)

DEVELOPMENT TO DATE:

Prototype tested with 660 sign language hand gesture recognition patterns at 98.63% accuracy

RELATED PAPERS/NEWS ARTICLES:

https://newsroom.ucla.edu/releases/glove-translates-sign-language-to-speech

DEMONSTRATION VIDEO:
UCLA TDG YouTube: Fully Integrated Stretchable Sensor Array for Sign Language Translation (2020-480)
ASL glove Supplementary Video 2

Patent Information:
For More Information:
Ed Beres
Business Development Officer
edward.beres@tdg.ucla.edu
Inventors:
Jun Chen