Implementation of boundary detection for autonomous rescue robot

In the aftermath of a natural disaster, urban calamity, or explosion, the site is likely to be unsafe, hard to reach, and strewn with debris and rubble. Such an area poses a threat to all rescue personnel who enter it in search of survivors. A rescue robot dedicated to exploring such territory would reduce personnel requirements and fatigue while being able to reach inaccessible areas. It also allows rescue personnel to focus their efforts on specific areas marked by the robot rather than spend time and energy searching the entire site. An autonomous rescue robot in unfamiliar terrain needs to maintain its track on site and stay aware of its proximity to the boundary edges; for these reasons, a boundary detection program is important to an autonomous rescue robot. This report explores and implements a boundary detection program for an autonomous robot on the assumption that a boundary is a colored line, using several image processing techniques: color, edge, line, and shape detection. The report carries out a comparative study to select the most appropriate methods for color, edge, and line detection, and then briefly introduces these methods, explaining their theory and operation. The implemented algorithm analyzes video frames captured from a single Firefly camera, passing each image through color detection, line detection, and edge and shape detection stages; on completing these steps, the program is able to track the boundary line. The report focuses on the results of these image analysis steps and discusses them while testing the accuracy, performance, efficiency, and robustness of the proposed program.
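The pipeline described in the abstract can be sketched in outline. The snippet below is a minimal, hypothetical illustration of the color-detection stage feeding a per-row boundary track; NumPy and the RGB range for the boundary color are assumptions, as the report specifies neither (the actual implementation also applies edge, line, and shape detection, for example via a Hough transform).

```python
import numpy as np

def track_boundary(frame_rgb, lo, hi):
    """For each image row, return the mean column of the pixels whose RGB
    values fall within [lo, hi] (the assumed boundary color), or -1.0 if no
    pixel in that row matches. This is only the color-detection stage of the
    pipeline sketched here; it is not the report's actual implementation."""
    lo, hi = np.asarray(lo), np.asarray(hi)
    # Color detection: boolean mask of pixels inside the assumed color range.
    mask = np.all((frame_rgb >= lo) & (frame_rgb <= hi), axis=-1)
    cols = np.arange(frame_rgb.shape[1])
    track = np.full(frame_rgb.shape[0], -1.0)
    for r in range(frame_rgb.shape[0]):
        hits = cols[mask[r]]        # columns in row r matching the color
        if hits.size:
            track[r] = hits.mean()  # centroid column of the boundary in row r
    return track
```

For example, a synthetic frame with a pure-red vertical line at column 2 yields a track of 2.0 for every row; running the same per-row tracking on successive video frames gives the robot a continuously updated estimate of its offset from the boundary line.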

Saved in:
Bibliographic Details
Main Author: Maruvanda, Aiyappa Chengappa.
Other Authors: Wang Han
Format: Final Year Project
Language: English
Published: 2010
Subjects:
Online Access:http://hdl.handle.net/10356/40797
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-40797
record_format dspace
spelling sg-ntu-dr.10356-407972023-07-07T16:51:23Z Implementation of boundary detection for autonomous rescue robot Maruvanda, Aiyappa Chengappa. Wang Han School of Electrical and Electronic Engineering DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics In the aftermath of a natural disaster, urban calamity, or explosion, the site is likely to be unsafe, hard to reach, and strewn with debris and rubble. Such an area poses a threat to all rescue personnel who enter it in search of survivors. A rescue robot dedicated to exploring such territory would reduce personnel requirements and fatigue while being able to reach inaccessible areas. It also allows rescue personnel to focus their efforts on specific areas marked by the robot rather than spend time and energy searching the entire site. An autonomous rescue robot in unfamiliar terrain needs to maintain its track on site and stay aware of its proximity to the boundary edges; for these reasons, a boundary detection program is important to an autonomous rescue robot. This report explores and implements a boundary detection program for an autonomous robot on the assumption that a boundary is a colored line, using several image processing techniques: color, edge, line, and shape detection. The report carries out a comparative study to select the most appropriate methods for color, edge, and line detection, and then briefly introduces these methods, explaining their theory and operation. The implemented algorithm analyzes video frames captured from a single Firefly camera, passing each image through color detection, line detection, and edge and shape detection stages; on completing these steps, the program is able to track the boundary line. The report focuses on the results of these image analysis steps and discusses them while testing the accuracy, performance, efficiency, and robustness of the proposed program. Bachelor of Engineering 2010-06-22T03:20:56Z 2010-06-22T03:20:56Z 2010 2010 Final Year Project (FYP) http://hdl.handle.net/10356/40797 en Nanyang Technological University 76 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
spellingShingle DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
Maruvanda, Aiyappa Chengappa.
Implementation of boundary detection for autonomous rescue robot
description In the aftermath of a natural disaster, urban calamity, or explosion, the site is likely to be unsafe, hard to reach, and strewn with debris and rubble. Such an area poses a threat to all rescue personnel who enter it in search of survivors. A rescue robot dedicated to exploring such territory would reduce personnel requirements and fatigue while being able to reach inaccessible areas. It also allows rescue personnel to focus their efforts on specific areas marked by the robot rather than spend time and energy searching the entire site. An autonomous rescue robot in unfamiliar terrain needs to maintain its track on site and stay aware of its proximity to the boundary edges; for these reasons, a boundary detection program is important to an autonomous rescue robot. This report explores and implements a boundary detection program for an autonomous robot on the assumption that a boundary is a colored line, using several image processing techniques: color, edge, line, and shape detection. The report carries out a comparative study to select the most appropriate methods for color, edge, and line detection, and then briefly introduces these methods, explaining their theory and operation. The implemented algorithm analyzes video frames captured from a single Firefly camera, passing each image through color detection, line detection, and edge and shape detection stages; on completing these steps, the program is able to track the boundary line. The report focuses on the results of these image analysis steps and discusses them while testing the accuracy, performance, efficiency, and robustness of the proposed program.
author2 Wang Han
author_facet Wang Han
Maruvanda, Aiyappa Chengappa.
format Final Year Project
author Maruvanda, Aiyappa Chengappa.
author_sort Maruvanda, Aiyappa Chengappa.
title Implementation of boundary detection for autonomous rescue robot
title_short Implementation of boundary detection for autonomous rescue robot
title_full Implementation of boundary detection for autonomous rescue robot
title_fullStr Implementation of boundary detection for autonomous rescue robot
title_full_unstemmed Implementation of boundary detection for autonomous rescue robot
title_sort implementation of boundary detection for autonomous rescue robot
publishDate 2010
url http://hdl.handle.net/10356/40797
_version_ 1772827909367529472