Obstacles mapping based on 3-D perception for mobile robot navigation
Many previous researchers have offered two-dimensional mapping for robot navigation. However, because two-dimensional mapping can only detect obstacles in a planar field and cannot detect obstacles at different elevations, researchers continue to look for better ways to discover obstacles in the three-dimensional space around the robot. This research proposes several steps for building a three-dimensional map. The first step is to develop a mobile robot as a test-bed platform. The robot senses obstacles by measuring distance with a depth camera, producing a point cloud that describes obstacle geometry as landmark positions in X, Y, and Z coordinates.

The second step is a method for accurately estimating robot translation and rotation using a sensor fusion technique that combines wheel odometry, visual odometry, and inertial odometry. Wheel odometry estimates the robot's position from wheel rotation speed and is unaffected by light, magnetism, or the gravity vector, but it accumulates error over time. Visual odometry estimates motion from camera images by combining the Features from Accelerated Segment Test (FAST) detector with singular value decomposition (SVD); however, it depends strongly on scene lighting and texture, and the less light and texture available, the larger the position estimation error. Inertial odometry uses magnetic, angular-rate, and gravity (MARG) measurements, combined through the Madgwick method to produce accurate orientation estimates, but it can only estimate rotational motion. This study therefore proposes a fusion method based on the Extended Kalman Filter (EKF) that produces a new estimate compensating for the weaknesses of each individual source (wheel odometry, visual odometry, and inertial odometry). The third step is the registration of the three-dimensional map based on the robot pose estimate and the depth measurements.

All of these issues are examined from an estimation-theoretic perspective through mathematical analysis, and the theory is validated through experimental investigations. A position estimation test using the EKF-based multi-sensor fusion over 120 seconds in a 10 m × 10 m area shows an average X-axis translation error of 7.6 cm, a Y-axis translation error of 8.5 cm, a roll rotation error of 0.678°, a pitch rotation error of 0.491°, and a yaw rotation error of 0.483°. The visual results show that the reconstructed 3-D map has minimal fractures or overlaps and represents the real environment.
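The central step in the abstract is the EKF that fuses wheel, visual, and inertial odometry into a single pose estimate. The Python sketch below is a minimal illustration of that kind of fusion, not the thesis' implementation: it predicts a planar pose [x, y, yaw] from wheel odometry and corrects it with a yaw observation (such as a Madgwick/MARG orientation output) and a position observation (such as a visual-odometry fix). The state layout, measurement models, and all noise values are assumptions chosen for illustration.

```python
# Minimal EKF fusion sketch (illustrative assumptions, not the thesis' exact model).
import numpy as np

def wrap(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate the pose [x, y, yaw] with wheel odometry (speed v, yaw rate w)."""
    theta = x[2]
    x_pred = np.array([x[0] + v * dt * np.cos(theta),
                       x[1] + v * dt * np.sin(theta),
                       wrap(theta + w * dt)])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, H, R, angle_idx=()):
    """Standard EKF correction with a linear measurement z = H x + noise."""
    y = z - H @ x
    for i in angle_idx:                  # wrap angular innovations
        y[i] = wrap(y[i])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ y
    x_new[2] = wrap(x_new[2])
    return x_new, (np.eye(len(x)) - K @ H) @ P

# Illustrative usage with made-up measurements and noise levels:
x, P = np.zeros(3), np.eye(3) * 0.1
Q = np.diag([0.01, 0.01, 0.005])                      # assumed process noise
x, P = ekf_predict(x, P, v=0.2, w=0.05, dt=0.1, Q=Q)  # wheel-odometry step
H_yaw = np.array([[0.0, 0.0, 1.0]])                   # inertial (MARG) yaw observation
x, P = ekf_update(x, P, np.array([0.004]), H_yaw, np.array([[0.001]]), angle_idx=(0,))
H_xy = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])   # visual-odometry position fix
x, P = ekf_update(x, P, np.array([0.021, 0.001]), H_xy, np.diag([0.02, 0.02]))
```

In such a scheme the prediction would typically run at the wheel-encoder rate while each correction is applied whenever its sensor delivers a measurement, so the fused estimate degrades gracefully when, for example, visual odometry loses texture or lighting.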
Main Author: | Achmad, M. S. Hendriyawan |
---|---|
Format: | Thesis (PhD) |
Language: | English |
Published: | 2020 |
Subjects: | TK Electrical engineering. Electronics Nuclear engineering |
Online Access: | http://umpir.ump.edu.my/id/eprint/30397/1/Obstacles%20mapping%20based%20on%203-D%20perception%20for%20mobile.pdf http://umpir.ump.edu.my/id/eprint/30397/ |
Institution: | Universiti Malaysia Pahang |