Tracking a person based on RGB-D data from a mobile robot
Main Author: | Lim, Ying |
---|---|
Other Authors: | Teoh Eam Khwang |
School: | School of Electrical and Electronic Engineering |
Format: | Final Year Project (FYP) |
Language: | English |
Published: | 2017 |
Degree: | Bachelor of Engineering |
Subjects: | DRNTU::Engineering::Electrical and electronic engineering |
Online Access: | http://hdl.handle.net/10356/70680 |
Institution: | Nanyang Technological University |
Physical Description: | 94 p. (application/pdf) |
Description:

In recent years, interest in human detection has grown because of its importance in many real-life applications such as surveillance and human-robot interaction. To understand human behaviour in different scenarios, the ability to track a person continuously is the natural next step for this field of research. To keep up with fast movements in human tracking, a suitable method is needed to improve the performance and robustness of the detection system, and additional features such as depth information can further enhance its efficiency. This project therefore explores the use of RGB-D (RGB-Depth) information from RGB-D sensors to help a mobile robot detect and follow a moving person.
One drawback of purely vision-based tracking methods is that small errors from each frame accumulate over a long run, because the background in each frame is not segmented and is tracked along with the target. With RGB-D sensors such as the Kinect, the large depth difference between a person and the surrounding environment makes it easier to segment the person from the background. With RGB-D information it therefore becomes possible to reduce the small per-frame shift errors by removing the background and re-centering the tracking window on the segmentation result.
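The abstract does not spell out how the depth-based background removal and window re-centering are implemented. The sketch below shows one minimal way such a step could work, assuming a registered depth image in metres and an axis-aligned tracking window; the function name `segment_person_by_depth` and the `depth_tolerance` threshold are illustrative choices, not taken from the thesis.

```python
import numpy as np

def segment_person_by_depth(depth, window, depth_tolerance=0.4):
    """Keep only pixels whose depth is close to the median depth inside the
    current tracking window, then re-center the window on those pixels.

    depth           : H x W array of depth values in metres (0 = no reading)
    window          : (x, y, w, h) of the current tracking window
    depth_tolerance : maximum deviation (metres) from the target's median
                      depth for a pixel to count as foreground
    """
    x, y, w, h = window
    roi = depth[y:y + h, x:x + w]
    valid = roi > 0                        # ignore pixels with no depth reading
    if not valid.any():
        return None, window                # no usable depth, keep the old window

    target_depth = np.median(roi[valid])   # assume the person dominates the window
    fg = valid & (np.abs(roi - target_depth) < depth_tolerance)

    mask = np.zeros(depth.shape, dtype=bool)
    mask[y:y + h, x:x + w] = fg

    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return mask, window
    # Re-center the window on the centroid of the foreground pixels,
    # which suppresses the small per-frame drift described in the abstract.
    cx, cy = int(xs.mean()), int(ys.mean())
    new_window = (cx - w // 2, cy - h // 2, w, h)
    return mask, new_window
```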
The first part of the project is the pre-processing of the images into point cloud data, combining the depth image and the RGB image from the dataset. Each frame is transformed into a 3D point cloud in which the person and the ground plane can be clearly distinguished. Based on the spatial relationship of the target across consecutive frames, a candidate for the target person is selected. Both the target person and the whole 3D point cloud are then fed into the tracking algorithm, and tracking is performed and compared using two different motion tracking algorithms.
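As a companion to the pre-processing step described above, the following is a minimal sketch of how a registered RGB-D frame can be back-projected into a coloured 3D point cloud using the pinhole camera model. The intrinsic parameters (FX, FY, CX, CY) are nominal Kinect-style values used purely for illustration; the abstract does not state which sensor calibration or point cloud library the project actually used.

```python
import numpy as np

# Nominal Kinect-style intrinsics; placeholders, not the thesis calibration.
FX, FY = 525.0, 525.0      # focal lengths in pixels
CX, CY = 319.5, 239.5      # principal point

def rgbd_to_point_cloud(rgb, depth):
    """Back-project a registered RGB-D frame into an N x 6 array of
    (X, Y, Z, R, G, B) points using the pinhole camera model.

    rgb   : H x W x 3 uint8 colour image
    depth : H x W array of depth in metres, 0 where no reading is available
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    valid = depth > 0                                # drop pixels with no depth

    z = depth[valid]
    x = (u[valid] - CX) * z / FX                     # pinhole back-projection
    y = (v[valid] - CY) * z / FY
    colours = rgb[valid].astype(np.float64)          # N x 3 colour values

    return np.column_stack([x, y, z, colours])
```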