Robot programming using augmented reality
Current robot programming approaches lack the intuition required for quick and simple applications. As new robotic applications are being identified, there is a greater need to be able to programme robots safely and quickly. Recently, an approach known as Programming by Demonstration (PbD) has emerged, primarily addressing intuitiveness in Human-Robot Interaction (HRI).
Saved in:
Main Author: Chong, Jonathan Wun Shiung
Other Authors: Andrew Nee Yeh Ching; Kamal Youcef-Toumi
Format: Theses and Dissertations
Language: English
Published: 2014
Subjects: DRNTU::Engineering::Manufacturing
Online Access: https://hdl.handle.net/10356/60515
Institution: Nanyang Technological University
Language: English
id: sg-ntu-dr.10356-60515
record_format: dspace
spelling:
Record ID: sg-ntu-dr.10356-60515 (last updated 2020-11-01T11:39:11Z)
Title: Robot programming using augmented reality
Author: Chong, Jonathan Wun Shiung
Other authors: Andrew Nee Yeh Ching; Kamal Youcef-Toumi
Programme: Singapore-MIT Alliance Programme
Subject: DRNTU::Engineering::Manufacturing
Abstract: as given in the description field below
Degree: DOCTOR OF PHILOSOPHY (IMST)
Deposited: 2014-05-28
Year: 2006
Type: Thesis
Citation: Chong, J. W. S. (2006). Robot programming using augmented reality. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/60515
DOI: 10.32657/10356/60515
Language: en
Extent: 192 p.
Format: application/pdf
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: DRNTU::Engineering::Manufacturing
description:
Current robot programming approaches lack the intuition required for quick and simple applications. As new robotic applications are being identified, there is a greater need to be able to programme robots safely and quickly. Recently, an approach known as Programming by Demonstration (PbD) has emerged, primarily addressing intuitiveness in Human-Robot Interaction (HRI). This approach is complemented by another emerging technology, Augmented Reality (AR), in which computer-generated 3D objects are blended (registered) onto a real-world scene to enhance the user's interaction with the real world. The work in this thesis marries the two concepts by proposing a new approach for immersive robot programming known as Robot Programming using AR (RPAR), in which the user directly moves a virtual robot amongst real objects in an unknown environment.

The first part of this work is the development of a new RPAR methodology aimed at applications such as pick-and-place tasks, where there are a number of possible path solutions for a given start and goal configuration pair. The user's ability to make sense of an unknown environment is utilized by enabling him/her to intuitively define the subsets of the free space that are relevant to his/her goals. Collision-free paths for user-defined start and goal configurations are then generated using a beam search strategy.

The second part of this work is the development of a new RPAR methodology aimed at applications such as arc welding and laser cutting, where the end-effector of the robot is constrained to follow a user-defined 3D path at an orientation that is consistent with respect to that path. The first step requires the user to perform a number of demonstrations to obtain data points of the path/curve to be followed. These data points are processed using a number of algorithms to produce an output curve that best represents the demonstrated curve, i.e., learning the human intention. The Piecewise Linear Parameterization (PLP) algorithm and a curve-learning method based on Bayesian neural networks and reparameterization are proposed. After the output curve is generated, the user interactively defines the free space along the curve. The orientation of the end-effector can then be planned intuitively using visual feedback from AR, and a final collision-free path can be obtained.

The two RPAR methodologies developed in this research have been applied to a number of case studies, and the various related issues have been discussed.
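The record contains only the abstract, not the thesis algorithms themselves. As a rough illustration of the kind of beam search the first methodology describes for generating collision-free paths, the sketch below keeps only the lowest-cost partial paths whose latest configuration lies in the collision-free region; the callables neighbors, is_collision_free and cost are assumed placeholders standing in for the thesis's actual expansion, collision-checking and scoring steps, not its implementation.

```python
import heapq

def beam_search_path(start, goal, neighbors, is_collision_free, cost,
                     beam_width=5, max_steps=200):
    """Greedy beam search over discretised robot configurations.

    At every step, each partial path in the beam is extended by one
    configuration, colliding extensions are discarded, and only the
    beam_width lowest-cost candidates are kept.  Generic sketch only,
    not the planner from the thesis.
    """
    beam = [(cost(start, goal), [start])]
    for _ in range(max_steps):
        candidates = []
        for _, path in beam:
            if path[-1] == goal:            # goal reached on a kept path
                return path
            for q in neighbors(path[-1]):   # extend by one configuration
                if is_collision_free(q):
                    candidates.append((cost(q, goal), path + [q]))
        if not candidates:                  # beam died out: no path found
            return None
        beam = heapq.nsmallest(beam_width, candidates, key=lambda c: c[0])
    return None
```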
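Similarly, the Piecewise Linear Parameterization (PLP) step is only named in the abstract. A minimal sketch of one plausible reading, assigning each demonstrated 3D point a parameter proportional to cumulative chord length and resampling the piecewise-linear curve at uniform parameter values, is given below; the function names and the sample data are assumptions for illustration, not the algorithm from the thesis.

```python
import numpy as np

def chord_length_parameterize(points):
    """Parameter in [0, 1] for each point, proportional to cumulative
    chord length along the demonstrated point sequence."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # segment lengths
    t = np.concatenate(([0.0], np.cumsum(seg)))
    return t / t[-1] if t[-1] > 0 else t

def resample_piecewise_linear(points, n_samples=50):
    """Resample the piecewise-linear curve through the demonstrated points
    at uniformly spaced parameter values, giving a cleaner curve that could
    feed a later curve-learning stage."""
    pts = np.asarray(points, dtype=float)
    t = chord_length_parameterize(pts)
    u = np.linspace(0.0, 1.0, n_samples)
    resampled = np.column_stack([np.interp(u, t, pts[:, k])
                                 for k in range(pts.shape[1])])
    return u, resampled

# Hypothetical noisy demonstration points along a weld seam.
demo = [[0.00, 0.00, 0.0], [0.10, 0.02, 0.0], [0.25, 0.01, 0.0], [0.40, 0.00, 0.0]]
u, curve = resample_piecewise_linear(demo, n_samples=10)
```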
author2: Andrew Nee Yeh Ching
format: Theses and Dissertations
author: Chong, Jonathan Wun Shiung
title: Robot programming using augmented reality
publishDate: 2014
url: https://hdl.handle.net/10356/60515
_version_: 1688665686807674880