Event-guided structured output tracking of fast-moving objects using a CeleX sensor

In this paper, we propose an event-guided support vector machine (ESVM) for tracking high-speed moving objects. Tracking fast-moving objects with low frame rate cameras is always difficult due to motion blur and large displacements. The accuracy problem can be solved by using high frame rate cameras, but at the expense of tremendous computational cost. To address this issue, our ESVM incorporates event-based guiding methods into the traditional structured support vector machine to improve tracking accuracy at a relatively low complexity. The event-based guiding methods comprise two models, event-position-guided search localization and event-intensity-guided sample supplement, which are based on the event features of the CeleX motion sensor. The motion sensor continuously responds to intensity changes, which are generally related to object motion. Once it detects an intensity change, the sensor outputs event packages, each containing the pixel location, time stamp, and pixel illumination. The generated events are continuous in the temporal domain and thus record the motion trajectory of fast-moving objects, which cannot be fully captured by frame-based cameras. In this paper, we convert high-speed test sequences into sequences of spiking events recorded by the CeleX motion sensor. Our approach offers high computational efficiency, and experiments on sequences from multiple tracking benchmarks demonstrate the superior accuracy and real-time performance of our method compared to state-of-the-art trackers.
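The abstract describes event packages (pixel location, time stamp, pixel illumination) and an event-position-guided search localization. As a rough illustration only, and not the authors' implementation, the idea of letting recent event positions steer the tracker's search region can be sketched as follows; the `Event` type, the time window, and the centroid heuristic are all assumptions for the sketch:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Event:
    """One event package from the motion sensor, as described in the
    abstract: pixel location (x, y), time stamp t, pixel illumination."""
    x: int
    y: int
    t: float          # time stamp in seconds
    intensity: float  # pixel illumination

def event_guided_search_center(events: List[Event],
                               t_now: float,
                               window: float = 0.01) -> Tuple[float, float]:
    """Hypothetical stand-in for event-position-guided localization:
    center the tracker's search region on the mean position of events
    fired within the last `window` seconds."""
    recent = [e for e in events if t_now - e.t <= window]
    if not recent:
        raise ValueError("no recent events to guide the search")
    cx = sum(e.x for e in recent) / len(recent)
    cy = sum(e.y for e in recent) / len(recent)
    return cx, cy
```

Because events arrive continuously rather than at frame boundaries, a guide like this can reposition the search window between frames, which is what lets the tracker follow large inter-frame displacements.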

Bibliographic Details
Main Authors: Huang, Jing, Wang, Shizheng, Guo, Menghan, Chen, Shoushun
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language:English
Published: 2020
Subjects:
Online Access:https://hdl.handle.net/10356/142927
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-142927
Citation: Huang, J., Wang, S., Guo, M., & Chen, S. (2018). Event-guided structured output tracking of fast-moving objects using a CeleX sensor. IEEE Transactions on Circuits and Systems for Video Technology, 28(9), 2413-2417. doi:10.1109/TCSVT.2018.2841516
ISSN: 1051-8215
© 2018 IEEE. All rights reserved.