Hand Gesture Recognition of Static Letters American Sign Language (ASL) Using Deep Learning

  • Abdulwahab A. Abdulhussein, University of Technology, Control and Systems Engineering
  • Firas A. Raheem

Abstract

American Sign Language (ASL) is a complex language based on a standard set of gestures. These gestures are formed with the hands, assisted by facial expression and body posture. ASL is the main communication language of deaf and hard-of-hearing people in North America and other parts of the world. In this paper, gesture recognition of static ASL letters using deep learning is proposed. The contribution consists of two parts. The first is resizing the static ASL binary images with bicubic interpolation and detecting the hand boundary with the Roberts edge detection method, which yields good recognition results. The second is classifying the 24 static alphabet characters of ASL using a Convolutional Neural Network (CNN) and deep learning. The classification accuracy is 99.3% and the loss-function error is 0.0002, with an elapsed training time of 36 minutes and 15 seconds over 100 iterations. Training is fast and gives very good results in comparison with related CNN, SVM, and ANN works.
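The preprocessing described in the abstract (bicubic resizing of the binary hand image, then boundary extraction with the Roberts edge operator) can be sketched as below. This is a minimal NumPy illustration under assumed parameters (image size, threshold), not the authors' implementation; in practice the bicubic resize would use a library call such as OpenCV's cv2.resize with INTER_CUBIC.

```python
import numpy as np

def roberts_edges(img, thresh=0.5):
    """Roberts cross edge detection on a 2-D grayscale/binary image.

    The operator takes two diagonal first differences and thresholds
    the gradient magnitude; it returns a binary edge map one row and
    one column smaller than the input.
    """
    img = img.astype(float)
    gx = img[:-1, :-1] - img[1:, 1:]   # main-diagonal gradient
    gy = img[:-1, 1:] - img[1:, :-1]   # anti-diagonal gradient
    mag = np.sqrt(gx ** 2 + gy ** 2)
    return (mag > thresh).astype(np.uint8)

# Toy binary "hand silhouette": a filled square on a black background.
# A real pipeline would first resize the segmented binary image with
# bicubic interpolation, e.g.:
#   img = cv2.resize(img, (64, 64), interpolation=cv2.INTER_CUBIC)
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0

edges = roberts_edges(img)
```

The edge map marks only the silhouette boundary, which is the hand-shape feature the classifier then consumes.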

Published
2020-06-25
How to Cite
Abdulhussein, A. A., & Raheem, F. A. (2020). Hand Gesture Recognition of Static Letters American Sign Language (ASL) Using Deep Learning. Engineering and Technology Journal, 38(6A), 926-937. https://doi.org/10.30684/etj.v38i6A.533