Real-time object tracking for event cameras

Object tracking is a fundamental task in many cutting-edge applications, e.g. autonomous driving and surveillance. The recently developed event camera brings new possibilities for solving challenges inherent in frame-based object tracking, such as deformation, re-scaling, background clutter and motion blur. Instead of synchronized frames, event cameras record motion as an asynchronous event stream E = {(x_i, y_i, t_i)} at an ultra-high temporal resolution of more than 1 MHz. In this work, a tracking framework is proposed that tracks a single object in a recursive manner, aligned with the varying event rate. Event data are modeled as space-time event clouds and fed to an adapted PointNet architecture to extract spatial and temporal information about the target object. Furthermore, the proposed framework processes events continuously and recursively in real time, and generates event-wise bounding boxes that form a best-fit, smooth bounding volume over time.
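To illustrate the approach described in the abstract, here is a minimal sketch, assuming PyTorch: events are treated as a space-time point cloud of (x, y, t) tuples, sliced into fixed-count windows whose duration adapts to the event rate, and passed through a PointNet-style network that regresses one bounding box per window. This is not the thesis implementation; the layer sizes, the (cx, cy, w, h) box parameterization, the 512-event window size, and the names EventPointNet and fixed_count_windows are all illustrative assumptions.

import torch
import torch.nn as nn

class EventPointNet(nn.Module):
    """PointNet-style regressor over a space-time event cloud (sketch)."""
    def __init__(self):
        super().__init__()
        # Shared per-point MLP: each (x, y, t) tuple is embedded
        # independently, as in PointNet.
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
        )
        # Box head: regress an assumed (cx, cy, w, h) parameterization
        # from the pooled global feature.
        self.head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 4),
        )

    def forward(self, events):
        # events: (batch, n_events, 3) cloud of normalized (x, y, t)
        per_point = self.point_mlp(events)    # (batch, n, 256)
        pooled = per_point.max(dim=1).values  # order-invariant max pooling
        return self.head(pooled)              # (batch, 4) one box per window

def fixed_count_windows(stream, n=512):
    """Slice an event stream into consecutive n-event windows, so window
    duration shrinks and grows with the event rate."""
    for start in range(0, stream.shape[0] - n + 1, n):
        yield stream[start:start + n]

# Toy usage: 2048 random events -> four windows, one box each.
net = EventPointNet()
stream = torch.rand(2048, 3)
for window in fixed_count_windows(stream):
    box = net(window.unsqueeze(0))  # add batch dimension
    print(box.shape)                # torch.Size([1, 4])

The fixed event count per window is what aligns processing with the event rate: when the scene moves fast and events arrive densely, windows span less time, giving finer-grained bounding-box updates.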

Bibliographic Details
Main Author: Zhang, Yexin
Other Authors: Goh Wang Ling
Format: Thesis-Master by Research
Language: English
Published: Nanyang Technological University 2020
Subjects: Engineering::Electrical and electronic engineering
Online Access: https://hdl.handle.net/10356/137297
id sg-ntu-dr.10356-137297
record_format dspace
school School of Electrical and Electronic Engineering
contact ewlgoh@ntu.edu.sg
degree Master of Engineering
date_issued 2019
date_deposited 2020-03-16
citation Zhang, Y. (2019). Real-time object tracking for event cameras. Master's thesis, Nanyang Technological University, Singapore.
doi 10.32657/10356/137297
license This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
file_format application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Electrical and electronic engineering
author2 Goh Wang Ling
format Thesis-Master by Research
author Zhang, Yexin
title Real-time object tracking for event cameras
publisher Nanyang Technological University
publishDate 2020
url https://hdl.handle.net/10356/137297