Flying mouse project
A wrist-worn standalone device measures three principal accelerations along three orthogonal axes, x, y and z, and communicates these values wirelessly to a computer. These acceleration values are sufficient to infer the orientation of the wearer's arm. This provides the basis for the...
Saved in:
Main Author: | Janis Abols |
---|---|
Other Authors: | Yu Yajun |
Format: | Final Year Project |
Language: | English |
Published: | 2014 |
Subjects: | DRNTU::Engineering::Computer science and engineering::Software::Software engineering |
Online Access: | http://hdl.handle.net/10356/59747 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-59747 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-59747 2023-07-07T17:31:33Z Flying mouse project Janis Abols Yu Yajun School of Electrical and Electronic Engineering Centre for Integrated Circuits and Systems DRNTU::Engineering::Computer science and engineering::Software::Software engineering Bachelor of Engineering 2014-05-14T01:58:05Z 2014 Final Year Project (FYP) http://hdl.handle.net/10356/59747 en Nanyang Technological University 51 p. application/pdf |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
topic |
DRNTU::Engineering::Computer science and engineering::Software::Software engineering |
description |
A wrist-worn standalone device measures three principal accelerations along three orthogonal axes, x, y and z, and communicates these values wirelessly to a computer. These acceleration values are sufficient to infer the orientation of the wearer's arm. This provides the basis for mouse-pointer control software driven by the inferred hand position and movement. Additionally, a hand gesture performed by the wearer is also described by acceleration values, which are sufficient to distinguish between various gestures.
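As an illustration of the orientation inference described above: when the device is held quasi-still, the accelerometer reading is dominated by gravity, so pitch and roll can be recovered directly from the three axis values. The function below is a minimal sketch under that assumption, not the project's actual code; the axis conventions are assumed.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (in radians) from one 3-axis accelerometer
    sample, assuming the device is held quasi-still so the measured
    vector is dominated by gravity. Axis conventions are assumed."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Device flat on a table: gravity lies entirely along the z axis,
# so both tilt angles come out as 0.0.
pitch, roll = tilt_from_accel(0.0, 0.0, 1.0)
```

Mapping these two angles to screen coordinates then gives the pointer position.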
This project develops software that allows for real-time control of the mouse pointer and real-time gesture recognition, to perform user-defined tasks. The gestures and their corresponding tasks are fully defined by the user of the software.
Mouse-pointer control is achieved by monitoring the orientation of the device in 3D, while gesture recognition is achieved by monitoring the total acceleration of the device and maintaining a number of past acceleration values in a data buffer. When a preset total-acceleration threshold is exceeded (signifying the presence of a hand gesture), the buffered 3-axis data is captured and analysed for possible recognition as a gesture.
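The buffering-and-threshold scheme above can be sketched as follows; the buffer length and threshold are assumed values for illustration, not the project's actual parameters.

```python
from collections import deque
import math

BUFFER_LEN = 32   # number of past samples kept (assumed value)
THRESHOLD = 1.5   # total-acceleration trigger, in g (assumed value)

# Rolling window of recent (ax, ay, az) samples; deque drops the
# oldest sample automatically once maxlen is reached.
buffer = deque(maxlen=BUFFER_LEN)

def feed(sample):
    """Append a 3-axis sample; return the buffered window for gesture
    analysis when the total acceleration exceeds the threshold, else None."""
    buffer.append(sample)
    ax, ay, az = sample
    total = math.sqrt(ax * ax + ay * ay + az * az)
    if total > THRESHOLD:
        return list(buffer)   # captured data to be analysed as a gesture
    return None
```

At rest the total acceleration stays near 1 g and `feed` returns None; a sharp hand movement exceeds the threshold and hands the captured window on for analysis.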
Gesture classification, learning and recognition are achieved using a multi-layer perceptron, i.e. an artificial neural network, which works as an automatic data classifier that does not require supervised learning, i.e. is capable of learning patterns independently.
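A minimal sketch of the inference side of such a classifier is shown below. The layer sizes, weights and rejection rule are illustrative placeholders, not the project's network; training is omitted, and a real system would learn the weights from captured gesture windows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny multi-layer perceptron: one hidden layer, softmax output.
# Sizes and weights are illustrative placeholders only.
N_IN, N_HID, N_OUT = 96, 16, 3   # 32 samples x 3 axes -> 3 gestures
W1 = rng.normal(0.0, 0.1, (N_IN, N_HID))
W2 = rng.normal(0.0, 0.1, (N_HID, N_OUT))

def classify(window, confidence=0.6):
    """Forward pass over a flattened gesture window; return the index
    of the winning gesture if its softmax score clears the preset
    confidence level, else None (gesture rejected)."""
    h = np.tanh(window @ W1)     # hidden-layer activations
    z = h @ W2                   # output-layer scores
    p = np.exp(z - z.max())
    p /= p.sum()                 # softmax probabilities
    best = int(np.argmax(p))
    return best if p[best] >= confidence else None
```

The confidence gate mirrors the record's description: a recognised gesture only triggers an action when its score meets the preset confidence level.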
Actions to be performed upon recognising a gesture are also user-defined and highly flexible. They are executed when a gesture is recognised with a preset confidence level.
Finally, the software is capable of maintaining several user-defined gesture-action profiles that can be cycled through by the user and selected as required, without stopping wireless interaction with the computer. |
author2 |
Yu Yajun |
author_facet |
Yu Yajun Janis Abols |
format |
Final Year Project |
author |
Janis Abols |
author_sort |
Janis Abols |
title |
Flying mouse project |
publishDate |
2014 |
url |
http://hdl.handle.net/10356/59747 |
_version_ |
1772826463469305856 |