Human-guided robot joints self-learning and transfer learning analytics

Style transfer techniques have seen wide adoption in recent years, with the CUT and CycleGAN network structures standing out for their superior accuracy and robustness compared to other methods. However, conventional style transfer methods rely on the structural characteristics of the content image...

Bibliographic Details
Main Author: Su, Haozun
Other Authors: Wen Bihan
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access:https://hdl.handle.net/10356/168012
Institution: Nanyang Technological University
id sg-ntu-dr.10356-168012
record_format dspace
spelling sg-ntu-dr.10356-1680122023-07-04T15:15:29Z Human-guided robot joints self-learning and transfer learning analytics Su, Haozun Wen Bihan School of Electrical and Electronic Engineering bihan.wen@ntu.edu.sg Engineering::Electrical and electronic engineering Style transfer techniques have seen wide adoption in recent years, with the CUT and CycleGAN network structures standing out for their superior accuracy and robustness compared to other methods. However, conventional style transfer methods rely on the structural characteristics of the content image to guide the transfer, resulting in a style image that mimics the original image structure. To address this limitation, this study proposes a novel approach that utilizes motion information of the human body as transfer information for style transfer networks. Specifically, the goal is to adapt the style of motion between a human hand and a robotic arm, thereby generating a virtual robotic arm image that reflects the same motion style as the human hand. The proposed method combines depth curve estimation-based image processing techniques with style transfer to enable stable stylized operations on content images under varying lighting conditions, thereby enhancing the robustness of the overall approach. To evaluate the proposed method, a suite of quantitative metrics, including PSNR, SSIM, MAE, L1, FID, and LPIPS, is employed to analyze the test results of the model, both before and after optimization. Future research could potentially explore incorporating object recognition techniques to extract target objects for the model. This study's approach holds positive implications for the field of robotics, especially in the area of robot training. Keywords: Style transfer, depth curve processing, robotics, image processing, computer vision, generative adversarial networks, contrastive unpaired translation, transfer learning, human perception.
Master of Science (Computer Control and Automation) 2023-05-21T08:59:37Z 2023-05-21T08:59:37Z 2023 Thesis-Master by Coursework Su, H. (2023). Human-guided robot joints self-learning and transfer learning analytics. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/168012 https://hdl.handle.net/10356/168012 en application/pdf Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Electrical and electronic engineering
spellingShingle Engineering::Electrical and electronic engineering
Su, Haozun
Human-guided robot joints self-learning and transfer learning analytics
description Style transfer techniques have seen wide adoption in recent years, with the CUT and CycleGAN network structures standing out for their superior accuracy and robustness compared to other methods. However, conventional style transfer methods rely on the structural characteristics of the content image to guide the transfer, resulting in a style image that mimics the original image structure. To address this limitation, this study proposes a novel approach that utilizes motion information of the human body as transfer information for style transfer networks. Specifically, the goal is to adapt the style of motion between a human hand and a robotic arm, thereby generating a virtual robotic arm image that reflects the same motion style as the human hand. The proposed method combines depth curve estimation-based image processing techniques with style transfer to enable stable stylized operations on content images under varying lighting conditions, thereby enhancing the robustness of the overall approach. To evaluate the proposed method, a suite of quantitative metrics, including PSNR, SSIM, MAE, L1, FID, and LPIPS, is employed to analyze the test results of the model, both before and after optimization. Future research could potentially explore incorporating object recognition techniques to extract target objects for the model. This study's approach holds positive implications for the field of robotics, especially in the area of robot training. Keywords: Style transfer, depth curve processing, robotics, image processing, computer vision, generative adversarial networks, contrastive unpaired translation, transfer learning, human perception.
author2 Wen Bihan
author_facet Wen Bihan
Su, Haozun
format Thesis-Master by Coursework
author Su, Haozun
author_sort Su, Haozun
title Human-guided robot joints self-learning and transfer learning analytics
title_short Human-guided robot joints self-learning and transfer learning analytics
title_full Human-guided robot joints self-learning and transfer learning analytics
title_fullStr Human-guided robot joints self-learning and transfer learning analytics
title_full_unstemmed Human-guided robot joints self-learning and transfer learning analytics
title_sort human-guided robot joints self-learning and transfer learning analytics
publisher Nanyang Technological University
publishDate 2023
url https://hdl.handle.net/10356/168012
_version_ 1772826556043886592