Robust Object Tracking via Locality Sensitive Histograms

Bibliographic Details
Main Authors: HE, Shengfeng, LAU, Rynson W.H., YANG, Qingxiong, WANG, Jiang, YANG, Ming-Hsuan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2017
Online Access:https://ink.library.smu.edu.sg/sis_research/8364
https://ink.library.smu.edu.sg/context/sis_research/article/9367/viewcontent/Robust_Object_Tracking_via_Locality_Sensitive_Histograms.pdf
Institution: Singapore Management University
Summary: This paper presents a novel locality sensitive histogram (LSH) algorithm for visual tracking. Unlike the conventional image histogram, which counts the frequency of occurrence of each intensity value by adding ones to the corresponding bin, an LSH is computed at each pixel location, and a floating-point value is added to the corresponding bin for each occurrence of an intensity value. This floating-point value decreases exponentially with the distance from the pixel location where the histogram is computed. An efficient algorithm is proposed that enables the LSHs to be computed in time linear in the image size and the number of bins. In addition, this efficient algorithm can be extended to handle color images. A robust tracking framework based on the LSHs is proposed, which consists of two main components: a new tracking feature that is robust to illumination change and a novel multiregion tracking algorithm that runs in real time even with hundreds of regions. Extensive experiments demonstrate that the proposed tracking framework outperforms state-of-the-art methods in challenging scenarios, especially when the illumination changes dramatically. Evaluation on the latest benchmark shows that the algorithm is the top performer.
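The summary's description of the LSH can be sketched in one dimension: each pixel q contributes alpha^|p - q| (rather than 1) to the bin of its intensity in the histogram at pixel p, and the whole set of histograms is computed in linear time via a left-to-right and a right-to-left recursive pass. The sketch below is an illustrative reading of that description, not the authors' implementation; the bin layout and the decay parameter `alpha` are assumed for the example.

```python
def lsh_1d(intensities, n_bins=8, alpha=0.9, max_intensity=256):
    """Return one histogram per pixel of a 1-D signal; pixel q's
    contribution to the histogram at pixel p decays as alpha**|p - q|."""
    n = len(intensities)
    bin_of = [v * n_bins // max_intensity for v in intensities]

    # Left-to-right pass: H_left[p] = B[p] + alpha * H_left[p-1],
    # where B[p] is the one-hot histogram of pixel p alone.
    left = [[0.0] * n_bins for _ in range(n)]
    for p in range(n):
        for b in range(n_bins):
            left[p][b] = alpha * left[p - 1][b] if p > 0 else 0.0
        left[p][bin_of[p]] += 1.0

    # Right-to-left pass, symmetric to the first.
    right = [[0.0] * n_bins for _ in range(n)]
    for p in range(n - 1, -1, -1):
        for b in range(n_bins):
            right[p][b] = alpha * right[p + 1][b] if p < n - 1 else 0.0
        right[p][bin_of[p]] += 1.0

    # Combine the two passes; pixel p itself appears in both,
    # so subtract its one-hot contribution once.
    return [[left[p][b] + right[p][b]
             - (1.0 if b == bin_of[p] else 0.0)
             for b in range(n_bins)]
            for p in range(n)]
```

Each pass touches every pixel and bin exactly once, so the cost is O(n * n_bins), matching the linear-time claim in the summary; a 2-D image would apply the same recursion along rows and then columns.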