Gesture-based 3D mesh modeling

Bibliographic Details
Main Authors: Carlos, Roland E.; Dalan, Clarence, Jr.; Sanchez, Aaron D.; Tolentino, Kevin
Format: text
Language: English
Published: Animo Repository 2012
Online Access: https://animorepository.dlsu.edu.ph/etd_bachelors/10599
Institution: De La Salle University
Summary: The user interface is an integral component of the way people use computers. User interfaces have evolved from the text-based interfaces of the past to the now-popular window-based interface. Neither of these, however, reflects how humans intuitively interact with objects: humans tend to use their hands when interacting with or expressing their ideas. One way to simulate this intuitive process is to use sketches or finger strokes as input for machines to process, instead of the usual window-and-pointer interface. This research focused on the implementation of a 3D modeling interface that utilizes gesture-based input. We created a system for Android tablet PCs that accepts touch input to create and manipulate 3D geometries. The system supports the creation and deletion of 3D primitives such as vertices, edges, faces, and meshes; manipulating them through modeling and viewing transformations; and loading and saving 3D geometries. The 3D modeling process is adapted to the modern user interface of tablet PCs and can emulate the human process of drawing, as an alternative to the window-and-pointer-based 3D modeling tools that are more commonly available.
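
The record does not include the system's actual data structures, but as a rough illustration of the kind of mesh representation the abstract describes (vertices, edges, and faces grouped into meshes, with modeling transformations applied to them), a minimal sketch in Java might look like the following. All class and method names here are hypothetical and do not come from the thesis.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical minimal mesh structure; names are illustrative only.
    class Vertex {
        double x, y, z;
        Vertex(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }

        // Simple modeling transformations applied per vertex.
        void scale(double s) { x *= s; y *= s; z *= s; }
        void translate(double dx, double dy, double dz) { x += dx; y += dy; z += dz; }
    }

    // A face is an ordered loop of vertex indices; edges are implied
    // between consecutive vertices in the loop.
    class Face {
        final int[] vertexIndices;
        Face(int... vertexIndices) { this.vertexIndices = vertexIndices; }
    }

    class Mesh {
        final List<Vertex> vertices = new ArrayList<>();
        final List<Face> faces = new ArrayList<>();

        int addVertex(Vertex v) { vertices.add(v); return vertices.size() - 1; }
        void addFace(Face f) { faces.add(f); }

        // Deleting a face; deleting a vertex would also require
        // reindexing the faces that reference it.
        void removeFace(int i) { faces.remove(i); }
    }

    public class MeshSketch {
        public static void main(String[] args) {
            Mesh mesh = new Mesh();
            int a = mesh.addVertex(new Vertex(0, 0, 0));
            int b = mesh.addVertex(new Vertex(1, 0, 0));
            int c = mesh.addVertex(new Vertex(0, 1, 0));
            mesh.addFace(new Face(a, b, c)); // one triangular face

            // Apply a modeling transformation to every vertex.
            for (Vertex v : mesh.vertices) v.scale(2.0);
            System.out.println("faces: " + mesh.faces.size()
                    + ", first vertex x: " + mesh.vertices.get(0).x);
        }
    }

On Android, the touch input the abstract mentions would typically arrive through MotionEvent callbacks (for example, a View's onTouchEvent or a GestureDetector listener), which a system like this would map onto operations such as the ones sketched above.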