Transactions on Machine Learning and Data Mining (ISSN: 1865-6781)

Volume 10 - Number 2 - October 2017 - Pages 41-55

American Sign Language Recognition using Hidden Markov Models and Wearable Motion Sensors

Rabeet Fatmi¹, Sherif Rashad¹ ², Ryan Integlia², Gabriel Hutchison¹

¹College of Innovation & Technology, Florida Polytechnic University, USA ²College of Engineering, Florida Polytechnic University, USA


In this paper, we propose an efficient and non-invasive solution to translate American Sign Language (ASL) to speech utilizing two wearable armbands called Myo. The compact Myo armbands that are used in this study are much more practical than existing solutions, which include glove-based techniques, camera-based systems, and the use of 3D depth sensors. We applied the Gaussian Mixture Model Hidden Markov Model (GMM-HMM) technique to achieve classification rates of up to 96.15% for ASL words (gestures). The HMM-based approach also sets a solid foundation for future work on the system, which includes continuous ASL recognition as well as signer independence.
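The classification approach described in the abstract scores each candidate word with a trained HMM and picks the model with the highest likelihood. The sketch below illustrates that decision rule for the simpler single-Gaussian-emission case (the paper uses GMM emissions); the model parameters, word labels, and helper names here are hypothetical, not taken from the paper.

```python
import numpy as np

def logsumexp(a):
    # numerically stable log(sum(exp(a)))
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def log_gauss(x, mu, var):
    # log N(x; mu, diag(var)) for one sensor frame x of dimension D
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

def forward_loglik(obs, pi, A, mu, var):
    """Log-likelihood of an observation sequence obs (T, D) under a
    Gaussian-emission HMM with S hidden states.

    pi: (S,) initial state probabilities; A: (S, S) transition matrix;
    mu, var: (S, D) per-state emission means and diagonal variances.
    """
    S = len(pi)
    log_A = np.log(A)
    # forward recursion in log space
    alpha = np.array([np.log(pi[s]) + log_gauss(obs[0], mu[s], var[s])
                      for s in range(S)])
    for t in range(1, len(obs)):
        alpha = np.array([
            logsumexp(alpha + log_A[:, s]) + log_gauss(obs[t], mu[s], var[s])
            for s in range(S)
        ])
    return logsumexp(alpha)

def classify(obs, models):
    # models: dict mapping a word to its HMM parameters (pi, A, mu, var);
    # the recognized word is the one whose HMM best explains the sequence
    return max(models, key=lambda w: forward_loglik(obs, *models[w]))
```

In a full GMM-HMM, `log_gauss` would be replaced by a log-sum over mixture components per state, but the per-word likelihood comparison is the same.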

Keywords: Sign language recognition, Hidden Markov Model (HMM), American Sign Language (ASL), Human Computer Interaction (HCI)
