Gesture-based interaction framework
Main Author: | |
---|---|
Other Authors: | |
Format: | Final Year Project |
Language: | English |
Published: | 2013 |
Subjects: | |
Online Access: | http://hdl.handle.net/10356/52070 |
Institution: | Nanyang Technological University |
Summary: | Traditional human-computer interaction usually relies on a mouse and keyboard for user input. Natural interaction, which uses gestures, postures and movement to communicate with the computer, makes the interaction more intuitive because it is centred on the users themselves. Among the many kinds of gestures, hand and finger gestures are some of the most commonly used when people interact with one another. This project aims to explore and develop a gesture-based interaction framework to support and enhance the user experience in using or designing gesture-based interaction systems. The focus of the project is the tracking of the user's hand gestures, as well as research on a finger detection algorithm.
The whole project was built on the Qt framework. OpenNI and PrimeSense NITE were utilized as the middleware to communicate with the sensor devices. Finally, the finger detection algorithm was implemented with the OpenCV libraries. |
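The record itself does not include the detection code, so the sketch below is only an illustration of one common OpenCV approach to finger detection: extract the hand contour from a binary mask, compute its convex hull, and count convexity defects (the valleys between fingers). It is not the project's actual implementation; the mask-extraction step, function name and the angle/depth thresholds are assumptions. In the actual system the mask would come from depth frames delivered through OpenNI/NITE hand tracking and the result would be shown in a Qt front end; those parts are omitted here.

```cpp
// Minimal sketch (not the project's code): count extended fingers in a binary
// hand mask using OpenCV contours, convex hull and convexity defects.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <cmath>
#include <vector>

// handMask: 8-bit single-channel image with the hand segmented in white.
int countFingers(const cv::Mat& handMask)
{
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(handMask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty())
        return 0;

    // Assume the largest contour is the hand.
    const auto& hand = *std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });

    std::vector<int> hullIdx;
    cv::convexHull(hand, hullIdx, false, false);   // hull as contour indices
    if (hullIdx.size() < 3)
        return 0;

    std::vector<cv::Vec4i> defects;
    cv::convexityDefects(hand, hullIdx, defects);

    auto dist = [](const cv::Point& p, const cv::Point& q) {
        return std::hypot(double(p.x - q.x), double(p.y - q.y));
    };

    int gaps = 0;
    for (const cv::Vec4i& d : defects) {
        cv::Point start = hand[d[0]];
        cv::Point end   = hand[d[1]];
        cv::Point farPt = hand[d[2]];
        double depth    = d[3] / 256.0;            // defect depth is stored fixed-point

        // Angle at the defect point: a deep, acute defect lies between two fingers.
        double a = dist(end, start);
        double b = dist(farPt, start);
        double c = dist(end, farPt);
        double angle = std::acos((b * b + c * c - a * a) / (2.0 * b * c + 1e-9));
        if (angle < CV_PI / 2.0 && depth > 20.0)   // illustrative thresholds
            ++gaps;
    }
    return gaps > 0 ? gaps + 1 : 0;                // N valleys imply N + 1 fingers
}
```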