Authors
Abdelmoty M. Ahmed, Reda Abo Alez, Muhammad Taha and Gamal Tharwat, Al-Azhar University, Egypt
Abstract
Sign language remains the preferred means of communication for the deaf and the hearing-impaired. It is a well-structured code of hand gestures in which every gesture has a specific meaning. This paper aims to develop a system for the Automatic Translation of Arabic Sign Language to Arabic Text (ATASAT). The system acts as a translator between deaf and mute people and hearing people to enhance their communication. The proposed system consists of five main stages: video and image capture, video and image processing, hand sign construction, classification, and finally text transformation and interpretation. The system relies on building two image-feature datasets for Arabic Sign Language alphabet gestures from two resources: the Arabic Sign Language dictionary and gestures collected from different human signers. It also uses gesture recognition techniques that allow the user to interact with the outside world. The system offers a novel hand detection technique that detects and extracts Arabic sign hand gestures from images or video. In the hand sign construction and classification steps, we use a set of appropriate features and compare different classification algorithms, such as KNN, MLP, C4.5, VFI and SMO, to determine the best classifier.
Keywords
Sign Language, Hand Gesture, Hand Sign Construction, Gesture Recognition, Hand Detection