Online object detection and tracking
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: 2013
Subjects:
Online Access: http://hdl.handle.net/10356/54220
Institution: Nanyang Technological University
Summary:

The report introduces a real-time object detection and tracking system built on HOG feature extraction, mean-shift tracking, and k-NN classification. A detailed explanation of each method is given in the subsequent chapters. MATLAB is used as the programming platform for the tracking system.
The tracking system is specifically designed to track the motion of a vehicle in a video. It consists of five stages: manual determination of the vehicle's coordinates, database generation, mean-shift coordinate prediction, accuracy determination, and tracking error correction.
The system is initialized by manually determining the X and Y coordinates of the car in the first frame of the video. Using the HOG feature extraction algorithm, a database of positive (vehicle-feature) and negative (background-feature) histograms of 540 bins each is then generated.
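As an illustration of this stage, the sketch below builds such a database in MATLAB. It assumes the Computer Vision Toolbox function extractHOGFeatures; the coordinates, window size, cell size, and file name are hypothetical, and the resulting descriptor length depends on those parameters rather than matching the report's 540 bins exactly.

```matlab
% Sketch of the database-generation stage, assuming the Computer Vision
% Toolbox function extractHOGFeatures. Coordinates, window size, cell size
% and file name are hypothetical, not the report's actual parameters.
frame = rgb2gray(imread('frame001.png'));    % first frame of the video

% Stage 1: manually determined vehicle position (hypothetical values)
x = 120; y = 80;                             % top-left corner of the window
w = 64;  h = 64;                             % window size in pixels

% Positive sample: HOG histogram of the vehicle window
vehiclePatch = imcrop(frame, [x y w-1 h-1]);
posHist = extractHOGFeatures(vehiclePatch, 'CellSize', [16 16]);

% Negative sample: HOG histogram of an arbitrary background window
bgPatch = imcrop(frame, [1 1 w-1 h-1]);
negHist = extractHOGFeatures(bgPatch, 'CellSize', [16 16]);

% Labelled histogram database for the later k-NN stage
database = [posHist; negHist];               % one histogram per row
labels   = [1; 0];                           % 1 = vehicle, 0 = background
```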
During the tracking stage, the vehicle's coordinates in each frame are predicted with the mean-shift algorithm. Further analysis and error correction are then performed with the k-NN classifier to increase the accuracy of the tracking system, and the process is repeated until the last frame of the video.
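The per-frame loop might be organized as sketched below, reusing `database`, `labels`, and the window variables from the previous sketch. The intensity-based weights driving the mean-shift update are only a placeholder for a real similarity surface, knnsearch (Statistics and Machine Learning Toolbox) stands in for the report's k-NN classifier, and `numFrames` and the frame file names are assumptions.

```matlab
% Sketch of the per-frame tracking loop: a mean-shift position update
% followed by a k-NN check against the HOG database. Variables x, y, w, h,
% database and labels come from the previous sketch; numFrames and the
% frame file names are assumptions.
pos = [x y];                                   % window top-left corner
for f = 2:numFrames
    frame = rgb2gray(imread(sprintf('frame%03d.png', f)));

    % Mean-shift: repeatedly move the window towards the weighted mean of
    % the pixel offsets inside it. Raw intensity is used as a placeholder
    % weight; a real tracker would use a histogram-similarity surface.
    for iter = 1:10
        patch = double(imcrop(frame, [pos(1) pos(2) w-1 h-1]));
        [dx, dy] = meshgrid(-w/2:w/2-1, -h/2:h/2-1);   % offsets from centre
        shift = [sum(dx(:).*patch(:)), sum(dy(:).*patch(:))] / sum(patch(:));
        pos = pos + round(shift);
        if all(abs(shift) < 1), break; end             % converged
    end

    % k-NN correction: check the predicted window against the database and
    % flag it for error correction if its nearest neighbour is background.
    patchHist = extractHOGFeatures(imcrop(frame, [pos(1) pos(2) w-1 h-1]), ...
                                   'CellSize', [16 16]);
    idx = knnsearch(database, patchHist);              % nearest neighbour
    if labels(idx) == 0
        % Prediction looks like background: a local re-search around the
        % previous position would be performed here.
    end
end
```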
Finally, the full tracking process is observed and evaluated. Recommended solutions for improving the efficiency of the tracking system are also provided in the report for future research.