Automatic sign language detector for video call
| Main Author: | |
|---|---|
| Other Authors: | |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2021 |
| Subjects: | |
| Online Access: | https://hdl.handle.net/10356/148038 |
| Institution: | Nanyang Technological University |
Summary: Video conferencing has been a big part of our lives since COVID-19 hit, but the hearing-impaired are unable to communicate efficiently during video calls. The Singapore Association for the Deaf (SADeaf) states that there has been a rise in interest in learning sign language to communicate with hearing-impaired family members or co-workers. However, sign language has a steep learning curve. This project aims to enable real-time interpretation of sign language using You Only Look Once (YOLO) neural networks. The application is designed to output the recognized word visually and audibly when the user signs to their web camera while video conferencing.
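The summary describes detecting signs in webcam frames with a YOLO network and outputting the recognized word both on screen and as speech. Below is a minimal sketch of that kind of pipeline, not the project's actual implementation: it assumes a trained Darknet-format YOLO model whose files are named `signs.cfg`, `signs.weights`, and `signs.names` (hypothetical names, not part of this record), loaded through OpenCV's DNN module, with pyttsx3 providing offline text-to-speech.

```python
# Sketch: YOLO-based sign detection on webcam frames with visual and audible output.
# Assumes a trained Darknet YOLO model for a sign vocabulary (hypothetical files:
# signs.cfg, signs.weights, signs.names).
import cv2
import numpy as np
import pyttsx3

CONF_THRESHOLD = 0.5
NMS_THRESHOLD = 0.4

net = cv2.dnn.readNetFromDarknet("signs.cfg", "signs.weights")
with open("signs.names") as f:
    classes = [line.strip() for line in f]

tts = pyttsx3.init()        # offline text-to-speech engine
cap = cv2.VideoCapture(0)   # default web camera
last_word = None            # avoid repeating the same spoken word

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # YOLO expects a fixed-size, normalized blob as input.
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    boxes, confidences, class_ids = [], [], []
    h, w = frame.shape[:2]
    for output in outputs:
        for det in output:
            scores = det[5:]
            class_id = int(np.argmax(scores))
            conf = float(scores[class_id])
            if conf > CONF_THRESHOLD:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                confidences.append(conf)
                class_ids.append(class_id)

    if boxes:
        # Non-maximum suppression removes overlapping duplicate detections.
        keep = cv2.dnn.NMSBoxes(boxes, confidences, CONF_THRESHOLD, NMS_THRESHOLD)
        for i in np.array(keep).flatten():
            x, y, bw, bh = boxes[i]
            word = classes[class_ids[i]]
            # Visual output: bounding box and the detected word.
            cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
            cv2.putText(frame, word, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
            # Audible output: speak each newly detected word once.
            if word != last_word:
                tts.say(word)
                tts.runAndWait()  # blocks briefly while the word is spoken
                last_word = word

    cv2.imshow("Sign language detector", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In an actual video-conferencing setup the frames would come from the call's video stream rather than a local OpenCV window; the sketch only illustrates the detect-then-display-and-speak loop the summary describes.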