Designing a sustainable and robust pipeline for integrating TensorFlow motion capture models into Unity

With the rise of the Metaverse, spurred by Facebook's attempt to build the "Horizon" platform and its rebranding as "Meta", virtual avatars have gained importance as alternative digital identities. However, moving from 2D web and mobile interfaces to embodying static 3D avatar profiles in the Metaverse poses a major user interface challenge in the shift from the familiar Web 2.0 to the unfamiliar Web 3.0, and this can hinder social interaction. To ease this adoption barrier, motion capture technology, which draws on augmented reality and machine learning, can be applied to these avatars to map a user's real-time facial and body movements onto their virtual counterpart. WebGL motion capture applications are currently the most cost-effective and accessible option: VR and AR headsets can cost thousands of dollars, whereas a WebGL application requires only a laptop and a webcam. Yet there is a lack of sustainable and robust motion capture libraries for building such WebGL applications, which limits how readily the current consumer market can be bridged to the Metaverse. Our solution is therefore a sustainable and robust motion capture pipeline, targeted specifically at Unity, that lets developers easily add motion capture to Metaverse avatars and thereby improve virtual social interaction between them. This is achieved by automating the transfer of TensorFlow.js-based motion capture data directly into a Unity 3D environment, reducing the friction of developing immersive and interactive Metaverse applications.

Keywords: Motion Capture, Human-Computer Interface, Unity, Web Augmented Reality, Metaverse, TensorFlow.js, PoseNet, Face Mesh, Kalidoface
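The record describes the pipeline only at a high level: TensorFlow.js models (e.g. PoseNet, Face Mesh) run in the browser and their output is forwarded into a Unity 3D scene. As a minimal, purely illustrative sketch of that idea, and not the project's actual implementation, the browser side might look like the TypeScript below; the WebSocket endpoint ws://localhost:8080/mocap and the JSON message shape are assumptions made for this example.

```typescript
// Illustrative sketch only: stream TensorFlow.js pose keypoints to a Unity-side
// listener. The WebSocket endpoint and message format are assumed, not taken
// from the project described above.
import * as poseDetection from '@tensorflow-models/pose-detection';
import '@tensorflow/tfjs-backend-webgl';

async function startMocapBridge(): Promise<void> {
  // Grab the user's webcam feed (a laptop webcam is the only hardware assumed).
  const video = document.createElement('video');
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // Load a pose-estimation model from the TensorFlow.js model zoo.
  const detector = await poseDetection.createDetector(
    poseDetection.SupportedModels.PoseNet
  );

  // Hypothetical endpoint where a Unity application listens for keypoint data.
  const socket = new WebSocket('ws://localhost:8080/mocap');

  const sendFrame = async (): Promise<void> => {
    const poses = await detector.estimatePoses(video);
    if (poses.length > 0 && socket.readyState === WebSocket.OPEN) {
      // Forward 2D keypoints (name, x, y, confidence score) as JSON;
      // the Unity side would map these onto the avatar's rig each frame.
      socket.send(JSON.stringify({ keypoints: poses[0].keypoints }));
    }
    requestAnimationFrame(sendFrame);
  };
  requestAnimationFrame(sendFrame);
}

startMocapBridge().catch(console.error);
```

On the Unity side, a corresponding WebSocket listener (or, for a Unity WebGL build, a JavaScript interop layer) would deserialize these messages and drive the avatar rig; which transport the project actually uses is not stated in the abstract.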

Bibliographic Details
Main Author: Chua, Zeta Hui Shi
Other Authors: Dusit Niyato (DNIYATO@ntu.edu.sg); Bryan Lim Wei Yang (bryan.limwy@ntu.edu.sg)
School: School of Computer Science and Engineering
Format: Final Year Project (FYP)
Degree: Bachelor of Engineering (Computer Science)
Language: English
Published: Nanyang Technological University, 2023
Subjects: Engineering::Computer science and engineering
Online Access: https://hdl.handle.net/10356/166912
Citation: Chua, Z. H. S. (2023). Designing a sustainable and robust pipeline for integrating TensorFlow motion capture models into Unity. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/166912
Institution: Nanyang Technological University