Gesture-based interactive presentation platform

Presentation software available today allows users to present only in a limited manner, since presenters can manipulate their slides solely through the mouse, keyboard, touch screen, or remote control. As a result, the presenter's interactions with the slides do not resemble a person's natural interactions with other people and objects in the environment. To address this problem, this research implemented gesture recognition in a presentation platform using Microsoft Kinect. The researchers gathered thousands of training samples from different people using the Kinect to build a model of the gestures integrated into the system. With hand gestures recognized by the system, presenters can interact with their audience in a more natural manner, and users can manipulate objects in the presentation using pre-defined hand gestures that correspond to valid commands. The platform therefore does not limit users to simple actions such as moving to the next or previous slide; object interaction extends to more complex commands such as resizing and rotating pictures and other visual aids in the presentation. Two approaches were used to extract features for the gestures: the first computed the distances of upper-body points from the head and neck, while the second computed the angles of the different upper-body parts. Tests showed that the second approach yielded higher accuracy.
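The two feature approaches described in the abstract can be sketched as follows. This is a minimal illustration only: the joint names and coordinates below are hypothetical stand-ins for Kinect skeleton data, and the thesis's actual joint set and feature definitions are not specified in this record.

```python
import math

# Hypothetical 3D joint positions (x, y, z), loosely modeled on
# Kinect skeleton joints; not the actual data used in the thesis.
joints = {
    "head":       (0.0, 1.70, 0.0),
    "neck":       (0.0, 1.50, 0.0),
    "shoulder_r": (0.2, 1.45, 0.0),
    "elbow_r":    (0.45, 1.30, 0.0),
    "wrist_r":    (0.6, 1.50, 0.1),
}

def distance(a, b):
    """Euclidean distance between two 3D points (first approach)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def joint_angle(a, b, c):
    """Angle at joint b formed by segments b->a and b->c, in degrees
    (second approach)."""
    v1 = [p - q for p, q in zip(a, b)]
    v2 = [p - q for p, q in zip(c, b)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# First approach: distances of upper-body joints from head and neck.
dist_features = [distance(joints[j], joints[ref])
                 for j in ("shoulder_r", "elbow_r", "wrist_r")
                 for ref in ("head", "neck")]

# Second approach: angles of the upper-body parts, e.g. the elbow
# angle formed by the shoulder-elbow and wrist-elbow segments.
elbow_angle = joint_angle(joints["shoulder_r"],
                          joints["elbow_r"],
                          joints["wrist_r"])
```

Either feature vector would then be fed to a classifier trained on the gathered samples; the abstract reports that the angle-based features gave higher accuracy.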

Bibliographic Details
Main Authors: Arlanza, Michael Angelo A., Cruz, Edwin S., Nepomuceno, Juan Paolo B., Villegas, Patricia Beatrice V.
Format: text
Language: English
Published: Animo Repository 2012
Subjects: Computer Sciences
Online Access:https://animorepository.dlsu.edu.ph/etd_bachelors/11465
Institution: De La Salle University