Real-time object tracking for event cameras


Bibliographic Details
Main Author: Zhang, Yexin
Other Authors: Goh Wang Ling
Format: Thesis-Master by Research
Language: English
Published: Nanyang Technological University 2020
Subjects:
Online Access: https://hdl.handle.net/10356/137297
Physical Description
Summary: Object tracking is a fundamental task in many cutting-edge applications, e.g. autonomous driving and surveillance. The recently developed event camera brings new possibilities for solving inherent challenges in frame-based object tracking, such as deformation, scale variation, background clutter and motion blur. Instead of synchronized frames, event cameras record motion as an asynchronous event stream E = {(x_i, y_i, t_i)} at an ultra-high temporal resolution of more than 1 MHz. In this work, a tracking framework is proposed to track a single object in a recursive manner aligned with the varying event rate. Event data are modeled as spacetime event clouds and fed to an adapted PointNet architecture to extract spatial and temporal information about the target object. Furthermore, the proposed framework is capable of processing events continuously and recursively in real time, and it generates event-wise bounding boxes that form a best-fit, smooth bounding volume over time.
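The spacetime event-cloud idea from the summary can be sketched in a few lines: each event (x_i, y_i, t_i) is treated as a 3D point, a temporal window of the stream is sliced out, and time is rescaled so the three coordinates live on comparable scales before being fed to a PointNet-style network. This is a minimal illustration under assumed preprocessing (the windowing and normalization details are not specified in the record, and the function name is hypothetical), not the thesis's exact pipeline.

```python
import numpy as np

def events_to_cloud(events, t_start, t_end):
    """Slice an event stream into a spacetime point cloud.

    `events` is an (N, 3) array-like of (x, y, t) triples, matching the
    stream E = {(x_i, y_i, t_i)} in the abstract. The time window and
    the [0, 1] normalization below are illustrative assumptions.
    """
    e = np.asarray(events, dtype=np.float64)
    # Keep only events whose timestamps fall inside the window.
    mask = (e[:, 2] >= t_start) & (e[:, 2] < t_end)
    cloud = e[mask].copy()
    # Rescale time into [0, 1] so x, y, t are on comparable scales
    # before feeding the cloud to a PointNet-style network.
    cloud[:, 2] = (cloud[:, 2] - t_start) / (t_end - t_start)
    return cloud

# Example: three events, of which two fall inside a 10 ms window.
stream = [(10, 20, 0.001), (11, 20, 0.004), (12, 21, 0.012)]
cloud = events_to_cloud(stream, 0.0, 0.01)
```

Because the window length adapts to the event rate rather than a fixed frame clock, slicing like this supports the recursive, rate-aligned processing the summary describes.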