Target detection and estimation using a stereo vision camera for autonomous navigation
Stereo vision cameras are increasingly popular in commercial markets. They are relatively inexpensive and can perform an array of functions. Currently, many different types of stereo vision camera are available. Such cameras have been used to aid the navig...
Saved in:
Main Author: | Chow, Song Qian. |
---|---|
Other Authors: | Wijerupage Sardha Wijesoma |
Format: | Final Year Project |
Language: | English |
Published: | 2009 |
Subjects: | DRNTU::Engineering |
Online Access: | http://hdl.handle.net/10356/17910 |
Tags: | |
Institution: | Nanyang Technological University |
Language: | English |
id |
sg-ntu-dr.10356-17910 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-17910 2019-12-10T12:08:48Z Target detection and estimation using a stereo vision camera for autonomous navigation Chow, Song Qian. Wijerupage Sardha Wijesoma School of Electrical and Electronic Engineering DRNTU::Engineering Stereo vision cameras are increasingly popular in commercial markets. They are relatively inexpensive and can perform an array of functions. Currently, many different types of stereo vision camera are available. Such cameras have been used to aid the navigation of autonomous vehicles such as cars and kayaks. The function this project focuses on is detecting possible targets/objects in a controlled environment and providing rough distance and direction estimates from the autonomous vehicle to the intended object. This function is exceptionally useful in the real-time control of autonomous vehicles. The project's experiments capture photographic images using the Point Grey Bumblebee® 2 stereo vision camera. After these images have been captured, software such as the FlyCapture® SDK and Triclops® SDK, provided by Point Grey Research, is used to process the raw images. Subsequently, edge detection techniques are applied to these images to detect objects. From this data, disparity maps and point clouds are obtained, and from this processed information the depth, or distance, of the objects from the camera can be estimated. In the next stage, the Algorithm Processing Unit (APU) of the autonomous vehicle works out a collision-avoidance solution so that the vehicle can navigate around these objects smoothly. All these processes form only the initial stages of vehicle navigation; in later stages, other high-level functions such as target engagement will be required to fulfil the different roles. 
This project concludes with a summary of the existing techniques used in edge detection of objects and generation of disparity maps. Lastly, some recommendations are made for future development in this area of study. Bachelor of Engineering 2009-06-17T09:09:26Z 2009-06-17T09:09:26Z 2009 2009 Final Year Project (FYP) http://hdl.handle.net/10356/17910 en Nanyang Technological University 72 p. application/msword |
institution |
Nanyang Technological University |
building |
NTU Library |
country |
Singapore |
collection |
DR-NTU |
language |
English |
topic |
DRNTU::Engineering |
spellingShingle |
DRNTU::Engineering Chow, Song Qian. Target detection and estimation using a stereo vision camera for autonomous navigation |
description |
Stereo vision cameras are increasingly popular in commercial markets. They are relatively inexpensive and can perform an array of functions. Currently, many different types of stereo vision camera are available. Such cameras have been used to aid the navigation of autonomous vehicles such as cars and kayaks.
The function this project focuses on is detecting possible targets/objects in a controlled environment and providing rough distance and direction estimates from the autonomous vehicle to the intended object. This function is exceptionally useful in the real-time control of autonomous vehicles.
The project's experiments capture photographic images using the Point Grey Bumblebee® 2 stereo vision camera. After these images have been captured, software such as the FlyCapture® SDK and Triclops® SDK, provided by Point Grey Research, is used to process the raw images. Subsequently, edge detection techniques are applied to these images to detect objects. From this data, disparity maps and point clouds are obtained, and from this processed information the depth, or distance, of the objects from the camera can be estimated.
In the next stage, the Algorithm Processing Unit (APU) of the autonomous vehicle works out a collision-avoidance solution so that the vehicle can navigate around these objects smoothly. All these processes form only the initial stages of vehicle navigation; in later stages, other high-level functions such as target engagement will be required to fulfil the different roles.
This project concludes with a summary of the existing techniques used in edge detection of objects and generation of disparity maps. Lastly, some recommendations are made for future development in this area of study. |
author2 |
Wijerupage Sardha Wijesoma |
author_facet |
Wijerupage Sardha Wijesoma Chow, Song Qian. |
format |
Final Year Project |
author |
Chow, Song Qian. |
author_sort |
Chow, Song Qian. |
title |
Target detection and estimation using a stereo vision camera for autonomous navigation |
title_short |
Target detection and estimation using a stereo vision camera for autonomous navigation |
title_full |
Target detection and estimation using a stereo vision camera for autonomous navigation |
title_fullStr |
Target detection and estimation using a stereo vision camera for autonomous navigation |
title_full_unstemmed |
Target detection and estimation using a stereo vision camera for autonomous navigation |
title_sort |
target detection and estimation using a stereo vision camera for autonomous navigation |
publishDate |
2009 |
url |
http://hdl.handle.net/10356/17910 |
_version_ |
1681048170309615616 |
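
The abstract's depth-estimation step relies on the standard depth-from-disparity relation for a rectified stereo pair, Z = f·B/d. A minimal sketch of that relation follows; the focal length and baseline used here are hypothetical placeholder values, not the Bumblebee® 2's calibrated parameters.

```python
def depth_from_disparity(disparity_px, focal_px=1000.0, baseline_m=0.12):
    """Estimate the depth (in metres) of a point from its stereo disparity.

    For a rectified stereo pair: Z = f * B / d, where f is the focal
    length in pixels, B the camera baseline in metres, and d the
    disparity in pixels. Both f and B here are illustrative values only.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px


# A point with 60 px disparity lies at 1000 * 0.12 / 60 = 2.0 m.
print(depth_from_disparity(60.0))  # 2.0
```

In practice the Triclops® SDK performs this conversion internally when generating point clouds from disparity maps, using the camera's factory calibration.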