Intelligent navigation of mobile robots in a crowded environment

Bibliographic Details
Main Author: Lim, Shervina Qi Wen
Other Authors: Soong Boon Hee
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2020
Online Access: https://hdl.handle.net/10356/139312
Institution: Nanyang Technological University
Description
Summary: As technology advances, robots have become part of our daily lives in many forms, such as robot vacuum cleaners, self-driving cars, and robot food servers. For them to move around safely, a good navigation algorithm has to be implemented. Various navigation methods have been proposed over the years, and one of the most widely used approaches to autonomous navigation in mobile robots is the vision-based approach, in which cameras track and predict pedestrian motion while LiDARs localise the robot. This approach has limitations, however: cameras and LiDARs have a limited viewing angle depending on the hardware chosen, resulting in blind spots. If a person approaches the robot through a blind spot, the robot is unable to detect the person and, in the worst case, a collision occurs. This poses a greater problem when the robot is placed in a highly crowded environment. The solution proposed here is to integrate tactile input into local path planners, allowing the robot to “sense” its surroundings. In this project, four different local path planners are studied in detail; by comparing their overall performance in simulation, the best-performing one, the EBand local planner, was chosen. To allow the robot to move autonomously, the omni-directional robot was configured according to the Robot Operating System (ROS) navigation stack, which provides a framework for easy implementation of different navigation algorithms and adopts layered costmaps, a structure well suited to feeding tactile input into local planning algorithms. Lastly, different types of tactile sensors, such as force-torque sensors and load cells, are tested based on the frequency of their readings. Tactile sensors are used because they pick up both the magnitude and the spatial information of the contact force, from which the robot can determine the exact location of the collision and the speed of the person. By combining tactile sensors with the existing setup, the robot is able to plan a path around the obstacle. This is done by adding a costmap layer that conveys the sensor information and places an obstacle at the point of collision, prompting the local planner to generate a new path to the goal (a sketch of such a layer is given below).
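
To make this mechanism concrete, below is a minimal sketch of such a tactile costmap layer, written against the standard costmap_2d plugin interface of the ROS navigation stack. The class name TactileLayer, the /tactile_contact topic, and the single-cell lethal marking are illustrative assumptions, not the report's actual implementation.

#include <algorithm>
#include <ros/ros.h>
#include <costmap_2d/layer.h>
#include <costmap_2d/cost_values.h>
#include <geometry_msgs/PointStamped.h>
#include <pluginlib/class_list_macros.h>

namespace tactile_layer
{
// Hypothetical costmap layer that turns tactile contact points into obstacles.
class TactileLayer : public costmap_2d::Layer
{
public:
  virtual void onInitialize()
  {
    ros::NodeHandle nh("~/" + name_);
    current_ = true;
    has_contact_ = false;
    // Contact points published by the tactile-sensor driver (assumed topic,
    // assumed to be expressed in the costmap's global frame).
    sub_ = nh.subscribe("/tactile_contact", 1, &TactileLayer::contactCallback, this);
  }

  void contactCallback(const geometry_msgs::PointStamped& msg)
  {
    contact_x_ = msg.point.x;
    contact_y_ = msg.point.y;
    has_contact_ = true;
  }

  virtual void updateBounds(double robot_x, double robot_y, double robot_yaw,
                            double* min_x, double* min_y,
                            double* max_x, double* max_y)
  {
    if (!has_contact_) return;
    // Grow the update window so it covers the contact point.
    *min_x = std::min(*min_x, contact_x_);
    *min_y = std::min(*min_y, contact_y_);
    *max_x = std::max(*max_x, contact_x_);
    *max_y = std::max(*max_y, contact_y_);
  }

  virtual void updateCosts(costmap_2d::Costmap2D& master_grid,
                           int min_i, int min_j, int max_i, int max_j)
  {
    if (!has_contact_) return;
    unsigned int mx, my;
    // Mark the cell at the point of contact as lethal so the local planner
    // (e.g. EBand) bends the path around it; a real layer would also clear
    // or decay the mark once the contact is gone.
    if (master_grid.worldToMap(contact_x_, contact_y_, mx, my))
      master_grid.setCost(mx, my, costmap_2d::LETHAL_OBSTACLE);
  }

private:
  ros::Subscriber sub_;
  double contact_x_, contact_y_;
  bool has_contact_;
};
}  // namespace tactile_layer

PLUGINLIB_EXPORT_CLASS(tactile_layer::TactileLayer, costmap_2d::Layer)

Because the navigation stack's costmaps are composed of plugins, such a layer would simply be listed in the local costmap's plugins parameter alongside the usual static, obstacle, and inflation layers.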
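
The claim that tactile sensing yields both the magnitude and the spatial information of the contact force can likewise be illustrated with a small helper. This sketch assumes a circular robot body and a planar force reading in the robot frame; neither the geometry nor the frame convention is specified in the abstract, so everything here is hypothetical.

#include <cmath>

// Estimated contact point on a circular robot body, in the robot frame.
struct Contact
{
  double x, y;       // contact location on the body perimeter (m)
  double magnitude;  // contact force magnitude (N)
};

// A push points roughly from the contact side toward the robot's centre,
// so the contact bearing is opposite the measured force direction.
Contact locateContact(double fx, double fy, double body_radius)
{
  double magnitude = std::sqrt(fx * fx + fy * fy);
  double bearing = std::atan2(-fy, -fx);  // direction of the contact point
  Contact c;
  c.x = body_radius * std::cos(bearing);
  c.y = body_radius * std::sin(bearing);
  c.magnitude = magnitude;
  return c;
}

Tracking how this estimated point moves across successive readings is what would let the robot infer the person's approach speed, which is presumably why the sensors are compared on reading frequency.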