Visual odometry from single camera
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2021
Subjects:
Online Access: https://hdl.handle.net/10356/152041
Institution: Nanyang Technological University
Summary: Visual odometry (VO) is the process of estimating the motion of an object from the input of visual sensors, e.g., different models of cameras, GPS, LiDAR, etc. For instance, monocular visual odometry adopts a single camera to estimate the local trajectory, up to a scale factor. As the front-end stage of SLAM (Simultaneous Localization and Mapping), visual odometry can be applied to various domains, including autonomous vehicles and augmented reality.
A monocular camera is chosen as the visual sensor in this visual odometry project for its accessibility and flexibility in setup and data processing compared with stereo cameras or laser sensors. Additionally, because cameras operate in the visible spectrum, they provide richer visual information in terms of colour, shape, and texture. Given the wide range of application scenarios in industry, cost control and device flexibility become vital, so cameras hold advantages over IMUs and LiDAR. Consequently, monocular visual odometry has great potential to find a place in this era of artificial intelligence.
This report explains in detail the main pipeline of a monocular visual odometry system, as well as the geometry applied at each stage, e.g., feature detectors and descriptors, feature matching and tracking, epipolar constraints, the camera matrix, and motion estimation. Furthermore, a real-time mode was implemented in this project, using an external wirelessly connected phone camera. A 2D local trajectory in bird's-eye view is generated with input solely from a monocular perspective camera.
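Of the stages listed above, the epipolar constraint lends itself to a short numerical illustration. The sketch below (a minimal example with a hypothetical relative motion, not taken from the report) builds the essential matrix E = [t]_x R from an assumed rotation and translation between two camera frames, projects one 3D point into both normalized image planes, and verifies that the correspondence satisfies x2ᵀ E x1 = 0. Note that scaling t by any constant leaves the constraint unchanged, which is exactly why monocular VO recovers the trajectory only up to a scale factor.

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x, so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# Hypothetical relative motion between two frames: a small yaw plus a translation.
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.2, 0.1])    # only the direction of t matters; its scale is free

E = skew(t) @ R                  # essential matrix E = [t]_x R

# One 3D point, expressed in camera-1 coordinates, then transformed to camera 2.
X1 = np.array([0.5, -0.3, 4.0])
X2 = R @ X1 + t

# Normalized image coordinates (pinhole projection, homogeneous with z = 1).
x1 = X1 / X1[2]
x2 = X2 / X2[2]

residual = float(x2 @ E @ x1)    # epipolar constraint: x2^T E x1 = 0
print(abs(residual) < 1e-9)      # True
```

In a full pipeline this constraint runs in the other direction: E is estimated from many matched feature pairs (e.g. with a five-point solver inside RANSAC) and then decomposed into R and a unit-norm t for motion estimation.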