See-through and spatial augmented reality - a novel framework for human-robot interaction

Bibliographic Details
Main Authors: Dinh, Quang Huy, Viatcheslav, Iastrebov, Seet, Gim Lee Gerald
Other Authors: School of Mechanical and Aerospace Engineering
Format: Conference or Workshop Item
Language: English
Published: 2017
Subjects:
Online Access:https://hdl.handle.net/10356/83720
http://hdl.handle.net/10220/42779
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-83720
record_format dspace
spelling sg-ntu-dr.10356-837202020-11-01T04:43:56Z See-through and spatial augmented reality - a novel framework for human-robot interaction Dinh, Quang Huy Viatcheslav, Iastrebov Seet, Gim Lee Gerald School of Mechanical and Aerospace Engineering Interdisciplinary Graduate School (IGS) 2017 3rd International Conference on Control, Automation and Robotics (ICCAR) Robotics Research Centre Human-robot interaction Augmented reality Autonomous and semi-autonomous mobile robots have been deployed to cooperate with humans in many industrial applications. These tasks require the human and the robot to communicate and present information quickly and effectively. Recent human-robot interfaces usually use a setup including a camera and a projector attached to the mobile robot to project information onto the floor or a wall during the interaction process. However, these interfaces have several limitations. First, projecting information with a projector works well indoors, but in outdoor contexts it is very difficult or even impossible for users to view this information, which makes the current framework inappropriate for many outdoor industrial tasks. Secondly, as the projector is the only device for exchanging information between human and robot, the interaction process is insecure: people working in the same environment can control the robot in the same manner as the main operator. Finally, current interfaces normally use a mouse, keyboard, or teach pendant to provide task information to the robot. This approach poses difficulties when the main operator works in an industrial context where protective equipment such as gloves or a helmet must be worn, making it hard to control a mouse or to type on a keyboard. This work proposes a new interface framework for human-computer interaction in industry that overcomes these limitations of previous works.
The framework uses a laser writer instead of a projector, making it suitable for both indoor and outdoor applications. Furthermore, the combination of see-through head-mounted-display augmented reality and spatial augmented reality provides a novel way to enhance the security of information exchange, since the system can separate the information presented to the main operator from that presented to other people working in the same environment. Finally, a novel hand-held device is incorporated into the framework, providing various input modalities for users to interact with the mobile robot. The device allows the mouse, keyboard, and teach pendant to be eliminated in industrial contexts. NRF (Natl Research Foundation, S’pore) Accepted version 2017-07-03T02:44:39Z 2019-12-06T15:28:44Z 2017-07-03T02:44:39Z 2019-12-06T15:28:44Z 2017 Conference Paper Dinh, Q. H., Viatcheslav, I., & Seet, G. L. G. (2017). See-through and spatial augmented reality - a novel framework for human-robot interaction. 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), 719-726. https://hdl.handle.net/10356/83720 http://hdl.handle.net/10220/42779 10.1109/ICCAR.2017.7942791 en © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: [http://dx.doi.org/10.1109/ICCAR.2017.7942791]. 9 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Human-robot interaction
Augmented reality
spellingShingle Human-robot interaction
Augmented reality
Dinh, Quang Huy
Viatcheslav, Iastrebov
Seet, Gim Lee Gerald
See-through and spatial augmented reality - a novel framework for human-robot interaction
description Autonomous and semi-autonomous mobile robots have been deployed to cooperate with humans in many industrial applications. These tasks require the human and the robot to communicate and present information quickly and effectively. Recent human-robot interfaces usually use a setup including a camera and a projector attached to the mobile robot to project information onto the floor or a wall during the interaction process. However, these interfaces have several limitations. First, projecting information with a projector works well indoors, but in outdoor contexts it is very difficult or even impossible for users to view this information, which makes the current framework inappropriate for many outdoor industrial tasks. Secondly, as the projector is the only device for exchanging information between human and robot, the interaction process is insecure: people working in the same environment can control the robot in the same manner as the main operator. Finally, current interfaces normally use a mouse, keyboard, or teach pendant to provide task information to the robot. This approach poses difficulties when the main operator works in an industrial context where protective equipment such as gloves or a helmet must be worn, making it hard to control a mouse or to type on a keyboard. This work proposes a new interface framework for human-computer interaction in industry that overcomes these limitations of previous works. The framework uses a laser writer instead of a projector, making it suitable for both indoor and outdoor applications.
Furthermore, the combination of see-through head-mounted-display augmented reality and spatial augmented reality provides a novel way to enhance the security of information exchange, since the system can separate the information presented to the main operator from that presented to other people working in the same environment. Finally, a novel hand-held device is incorporated into the framework, providing various input modalities for users to interact with the mobile robot. The device allows the mouse, keyboard, and teach pendant to be eliminated in industrial contexts.
author2 School of Mechanical and Aerospace Engineering
author_facet School of Mechanical and Aerospace Engineering
Dinh, Quang Huy
Viatcheslav, Iastrebov
Seet, Gim Lee Gerald
format Conference or Workshop Item
author Dinh, Quang Huy
Viatcheslav, Iastrebov
Seet, Gim Lee Gerald
author_sort Dinh, Quang Huy
title See-through and spatial augmented reality - a novel framework for human-robot interaction
title_short See-through and spatial augmented reality - a novel framework for human-robot interaction
title_full See-through and spatial augmented reality - a novel framework for human-robot interaction
title_fullStr See-through and spatial augmented reality - a novel framework for human-robot interaction
title_full_unstemmed See-through and spatial augmented reality - a novel framework for human-robot interaction
title_sort see-through and spatial augmented reality - a novel framework for human-robot interaction
publishDate 2017
url https://hdl.handle.net/10356/83720
http://hdl.handle.net/10220/42779
_version_ 1683494464901873664