Designing a sustainable and robust pipeline for integrating TensorFlow motion capture models into Unity

Bibliographic Details
Main Author: Chua, Zeta Hui Shi
Other Authors: Dusit Niyato
Format: Final Year Project
Language:English
Published: Nanyang Technological University 2023
Subjects:
Online Access:https://hdl.handle.net/10356/166912
Institution: Nanyang Technological University
Description
Summary: With the significant rise of the Metaverse, driven by Facebook's attempt to create the "Horizon" Metaverse and its rebranding to "Meta", virtual avatars have gained importance as alternative digital identities. However, transitioning from 2D web and mobile interfaces to embodying static 3D avatar profiles in the Metaverse poses a significant user-interface challenge in the shift from the familiar Web 2.0 to the unfamiliar Web 3.0, thereby hindering social interaction. To overcome these user-adoption barriers, motion capture technology can be applied to avatars to map a user's real-time facial and body movements onto their virtual counterparts. Motion capture draws on both augmented reality and machine learning. WebGL motion capture applications are the most cost-effective and accessible option compared with VR and AR alternatives today: VR and AR headsets can cost thousands of dollars, whereas a WebGL application runs on a laptop with a webcam. However, there is a lack of sustainable and robust motion capture libraries for building such WebGL applications, which limits the accessibility of bridging the current consumer market to the Metaverse. Our solution is therefore to implement a sustainable and robust motion capture pipeline, specifically for Unity, that lets developers easily add motion capture to Metaverse avatars and thereby improve virtual social interaction between them. This is achieved by automating the transfer of TensorFlow.js-based motion capture data directly into a Unity 3D environment, reducing the friction of developing immersive and interactive Metaverse applications. Keywords: Motion Capture, Human Computer Interface, Unity, Web Augmented Reality, Metaverse, TensorFlow.js, PoseNet, Face Mesh, Kalidoface
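The pipeline described above hinges on moving TensorFlow.js pose data from the browser into Unity. One common approach (a minimal sketch, not the project's confirmed implementation) is to serialize each frame's keypoints into a compact JSON message that a listener on the Unity side could deserialize. The names below (`PoseMessage`, `serializePose`) and the confidence-threshold filtering are illustrative assumptions; PoseNet's keypoint shape (`part`, `score`, `position`) follows the published TensorFlow.js model output.

```typescript
// Hypothetical sketch: serializing TensorFlow.js PoseNet keypoints into a
// compact JSON message for transport (e.g. over a WebSocket) to Unity.
// PoseMessage and serializePose are illustrative names, not from the project.

interface Keypoint {
  part: string;                       // e.g. "nose", "leftShoulder"
  score: number;                      // detection confidence, 0..1
  position: { x: number; y: number }; // pixel coordinates in the video frame
}

interface PoseMessage {
  timestamp: number;
  keypoints: { part: string; x: number; y: number }[];
}

// Keep only keypoints above a confidence threshold so the Unity side
// never animates joints from noisy, low-confidence detections.
function serializePose(keypoints: Keypoint[], minScore = 0.5): string {
  const msg: PoseMessage = {
    timestamp: Date.now(),
    keypoints: keypoints
      .filter(k => k.score >= minScore)
      .map(k => ({ part: k.part, x: k.position.x, y: k.position.y })),
  };
  return JSON.stringify(msg);
}

// Example frame: two keypoints, one below the confidence threshold.
const sample: Keypoint[] = [
  { part: "nose", score: 0.9, position: { x: 120, y: 80 } },
  { part: "leftEar", score: 0.2, position: { x: 100, y: 85 } },
];
const json = serializePose(sample);
console.log(json); // only the "nose" keypoint survives the filter
```

In a real pipeline this string would be produced once per estimated frame and pushed to Unity, where a C# component would deserialize it and drive the avatar rig; the filtering threshold trades dropped joints against jitter.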