Augmented reality interface for taping robot

Applying masking tape to a particular area is an important step for protecting uninvolved surfaces in processes such as mechanical part repair or surface protection. In the past, the task was time-consuming and required a great deal of manual work. In recent years, with advances in the fields of automated robotic systems and computer vision, the task can now be completed with the help of an automatic taping system comprising a 3D scanner, a manipulator and a rotating platform. This implementation has been shown to provide better quality and to be at least twice as fast as the same work done by a human operator. However, this setup still has some limitations. First, it is difficult for the user to monitor the taping process, since the system uses the 3D scanner to reconstruct the surface model and there is no calibrated projector to overlay the manipulator's trajectory on the real surface. Second, the user is expected to identify the area for masking on a computer with a keyboard and mouse, which requires some expert knowledge and may not be appropriate in an industrial context where people wear protective equipment such as gloves or helmets. This paper introduces the use of spatial augmented reality technology and a wearable device in the semi-automatic taping robotic system, along with the related calibration algorithms, to enhance the user experience. The framework and its components are presented, together with a case study and some results.

Bibliographic Details
Main Authors: Dinh, Huy; Yuan, Quilong; Viatcheslav, Iastrebov; Seet, Gerald
Other Authors: School of Mechanical and Aerospace Engineering
Format: Conference or Workshop Item
Language: English
Published: 2020
Subjects: Engineering::Mechanical engineering; Human-Robot Interaction; Laser Writer
Online Access:https://hdl.handle.net/10356/138181
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-138181
Conference: 2017 18th International Conference on Advanced Robotics (ICAR)
Other Affiliations: Interdisciplinary Graduate School (IGS); Robotics Research Centre
Funding Agency: NRF (National Research Foundation, Singapore)
Version: Accepted version
Citation: Dinh, H., Yuan, Q., Vietcheslav, I., & Seet, G. (2017). Augmented reality interface for taping robot. Proceedings of the 2017 18th International Conference on Advanced Robotics (ICAR). doi:10.1109/ICAR.2017.8023530
ISBN: 978-1-5386-3158-4
DOI: 10.1109/ICAR.2017.8023530
Rights: © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at https://doi.org/10.1109/ICAR.2017.8023530.
Library: NTU Library, Singapore
Collection: DR-NTU