Thai sign language translation using scale invariant feature transform and hidden Markov models

Bibliographic Details
Main Authors: Sansanee Auephanwiriyakul, Suwannee Phitakwinai, Wattanapong Suttapak, Phonkrit Chanda, Nipon Theera-Umpon
Format: Journal
Published: 2018
Subjects:
Online Access:https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84878059124&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/52444
Institution: Chiang Mai University
Description
Summary: Visual communication is important for a deaf and/or mute person. It is also one of the tools for communication between humans and machines. In this paper, we develop an automatic Thai sign language translation system that is able to translate sign language other than finger-spelling. In particular, we utilize the Scale Invariant Feature Transform (SIFT) to match a test frame with observation symbols from keypoint descriptors collected in the signature library. These keypoint descriptors are computed from several keyframes recorded at different times of day, over several days, from five subjects. Hidden Markov Models (HMMs) are then used to translate observation sequences into words. We also collect Thai sign language videos from 20 subjects for testing. On average, the system achieves approximately 86-95% accuracy in the signer-dependent experiments, 79.75% in the signer-semi-independent experiment (using only the same subjects as in the HMM training), and 76.56% in the signer-independent experiment. These results are from the constrained setting, in which each signer wears a long-sleeved shirt in front of a dark background. The unconstrained setting, in which each signer does not wear a long-sleeved shirt and appears in front of various natural backgrounds, yields a good result of around 74% on average in the signer-independent experiment. An important feature of the proposed system is its consideration of the shapes and positions of the fingers, in addition to hand information. This feature gives the system the ability to recognize hand-sign words that have similar gestures. © 2013 Elsevier B.V. All rights reserved.
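The abstract describes a two-stage pipeline: SIFT descriptors extracted from each test frame are matched against keyframe descriptors in a signature library to produce a discrete observation symbol, and per-word HMMs then score the resulting symbol sequence. The sketch below (Python with OpenCV and NumPy) illustrates that idea only; it is not the authors' implementation, and the library layout, symbol IDs, ratio-test threshold, and HMM parameters are assumptions for illustration.

```python
# Minimal sketch of the SIFT-to-symbol and HMM-scoring stages described in
# the abstract. The signature "library" is assumed to be a dict mapping a
# hypothetical symbol ID to a keyframe's SIFT descriptor array.

import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

def frame_to_symbol(frame_bgr, library):
    """Return the library symbol whose keyframe descriptors best match the
    frame's SIFT descriptors (Lowe's ratio test), or None if no keypoints."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, desc = sift.detectAndCompute(gray, None)
    if desc is None:
        return None
    best_symbol, best_count = None, 0
    for symbol_id, lib_desc in library.items():
        pairs = matcher.knnMatch(desc, lib_desc, k=2)
        good = sum(1 for p in pairs
                   if len(p) == 2 and p[0].distance < 0.75 * p[1].distance)
        if good > best_count:
            best_symbol, best_count = symbol_id, good
    return best_symbol

def log_viterbi_score(obs, log_pi, log_A, log_B):
    """Log-domain Viterbi score of a discrete observation sequence under one
    word's HMM (initial log_pi, transition log_A, emission log_B)."""
    delta = log_pi + log_B[:, obs[0]]
    for t in range(1, len(obs)):
        delta = np.max(delta[:, None] + log_A, axis=0) + log_B[:, obs[t]]
    return np.max(delta)
```

In use, the symbol sequence from a sign video would be scored against the HMM of every vocabulary word, and the word whose model gives the highest Viterbi score would be output as the translation.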