Understanding hand sign language in human-computer interaction

In this project, Let’s Talk, an English-to-American Sign Language (ASL) translator and ASL learning application, was developed around Sydney, a 3-D human figure who serves as the main character. The application is intended as a communication tool that helps users communicate with hearing-impaired people. Words or sentences for translation can be typed into the GUI or entered through speech recognition, and a quiz is included as a learning tool. Before design and construction began, a software process model was selected: the Incremental Model. Following its flow, the project was divided into two phases, a prototype phase and an enhancement phase. Visual Basic 6.0 was used for programming and GUI creation, and the GUI is designed to be user-friendly and interactive. Sydney was created with Poser 7, a three-dimensional rendering and animation tool produced by Smith Micro Software. Considerable research and planning preceded the construction of the animation videos and other databases, and many hours were spent manipulating Sydney to sign ASL correctly and producing the animation videos for the GUI. Because the raw video produced by Poser 7 is large, Auto Gordian Knot was used to convert and compress the raw AVI files into AVI with the XviD codec. Speech recognition software, Dragon NaturallySpeaking 10, was also successfully integrated with the application, and text-to-speech was later added using the Cepstral Diane US English voice, with the audio extracted using the TextToWav application. With the help of these software tools, the Let’s Talk database was successfully created.
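The core translation step the abstract describes (an English word or sentence is mapped to pre-rendered ASL animation clips for playback) can be sketched roughly as below. This is an illustrative reconstruction, not the project's actual code: the project was written in Visual Basic 6.0, and the function name, database shape, and fingerspelling fallback are all assumptions.

```python
# Hypothetical sketch of the translator's lookup step: split the input
# into words, map each known word to its pre-rendered ASL clip, and
# fall back to fingerspelling clips for unknown words. All names and
# file names here are illustrative, not taken from the project.

def translate_to_clips(sentence, clip_db):
    """Return the ordered list of animation clip names to play."""
    clips = []
    for word in sentence.lower().split():
        word = word.strip(".,!?")
        if word in clip_db:
            clips.append(clip_db[word])
        else:
            # Unknown word: fingerspell it letter by letter.
            clips.extend(f"letter_{ch}.avi" for ch in word if ch.isalpha())
    return clips

clip_db = {"hello": "hello.avi", "friend": "friend.avi"}
print(translate_to_clips("Hello, friend!", clip_db))
# ['hello.avi', 'friend.avi']
```

The GUI would then play the returned clips in sequence; speech-recognition input would feed the same function after transcription.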

Bibliographic Details
Main Author: Er, Alvin Beng Kiong
Other Authors: Yap Kim Hui
Format: Final Year Project
Language: English
Published: 2009
Subjects: DRNTU::Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Online Access:http://hdl.handle.net/10356/15764
Institution: Nanyang Technological University
School: School of Electrical and Electronic Engineering
Degree: Bachelor of Engineering
Deposited: 2009-05-14
Extent: 94 p.
File format: application/pdf
Library: NTU Library, Singapore
Collection: DR-NTU