An Intelligent Mobile Platform to Recognize and Translate Sign Language using Advanced Language Models and Machine Learning

Authors

Arlene Chang¹ and Jonathan Sahagun², ¹USA, ²California State Polytechnic University, USA

Abstract

Sign language, the main language used by Deaf individuals, has historically been suppressed in favor of speech and oralism. Most hearing people also do not know sign language, creating a linguistic and cultural gap between the Deaf and hearing communities today. This paper proposes an American Sign Language (ASL) recognition model embedded in a two-way translation assistant. One function uses machine learning to detect ASL handshapes and translate signs into English, while the second uses motion-capture techniques to translate written English into ASL performed by a generated animated character. We additionally use ChatGPT's large language model for auto-completion, providing a prediction service that infers the intended message. A large amount of training data was required to cover the variety of backgrounds, lighting conditions, and hand sizes, colors, lengths, and widths. The resulting model achieves a 90% accuracy score. Although other approaches exist, such as communication service groups for the Deaf or sensory-augmentation technologies, our application uses an AI model to bridge verbal communication gaps, allowing Deaf individuals to overcome language barriers without needing to accommodate a dominantly hearing society. Both hearing and Deaf users can communicate in their natural language in real time, enabling more emotional, personal, and effective communication.
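The abstract does not detail the recognition pipeline. The following is a minimal sketch, assuming MediaPipe hand landmarks as input features and a k-nearest-neighbors classifier standing in for the paper's trained model, of how the ASL handshape-detection step might be structured; the feature extraction, classifier choice, and function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the ASL handshape-recognition step.
# Assumptions (not from the paper): MediaPipe hand landmarks as features,
# a k-NN classifier standing in for the trained model.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

mp_hands = mp.solutions.hands

def landmarks_from_frame(frame_bgr):
    """Extract 21 (x, y, z) hand landmarks from one camera frame, or None."""
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    pts = result.multi_hand_landmarks[0].landmark
    coords = np.array([[p.x, p.y, p.z] for p in pts])
    # Re-center on the wrist landmark so features are invariant to where
    # the hand appears in the frame.
    return (coords - coords[0]).flatten()

# X_train / y_train would hold landmark vectors and handshape labels drawn
# from the large, varied training set the abstract describes.
clf = KNeighborsClassifier(n_neighbors=5)
# clf.fit(X_train, y_train)

def recognize(frame_bgr):
    """Return the predicted handshape label for a frame, or None."""
    feats = landmarks_from_frame(frame_bgr)
    return None if feats is None else clf.predict([feats])[0]
```

Normalizing landmarks relative to the wrist is one conventional way to reduce sensitivity to hand position and scale, which speaks to the abstract's point about variation in hand size and framing across the training data.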

Keywords

American Sign Language, Machine Learning, Translation, Flutter

Volume 14, Number 17