Video event detection using motion relativity and visual relatedness

Event detection plays an essential role in video content analysis. However, existing features are still weak for event detection because: i) most features capture either what is involved in an event or how the event evolves, but not both, and thus cannot describe the event completely; ii) to capture event evolution, only the motion distribution over the whole frame is used, which proves noisy for unconstrained videos; iii) the estimated object motion is usually distorted by camera movement. To cope with these problems, we propose a new motion feature, the Expanded Relative Motion Histogram of Bag-of-Visual-Words (ERMH-BoW), which employs motion relativity and visual relatedness for event detection. In ERMH-BoW, the "what" aspect of an event is represented with Bag-of-Visual-Words (BoW), and relative motion histograms between visual words are constructed to depict object activities, i.e., the "how" aspect of the event. ERMH-BoW thus integrates both the "what" and "how" aspects for a complete event description. Instead of global motion distribution features, the local motion of visual words is employed, which is more discriminative for event detection. We further show that, by employing relative motion, ERMH-BoW faithfully describes object activities in an event regardless of varying camera movement. In addition, to alleviate the visual-word correlation problem in BoW, we propose a novel method to expand the relative motion histogram by diffusing relative motion among correlated visual words, where correlation is measured by visual relatedness. To validate the effectiveness of the proposed feature, ERMH-BoW is used to measure video clip similarity with the Earth Mover's Distance (EMD) for event detection. Experiments on detecting LSCOM events in the TRECVID 2005 video corpus show that performance is improved by 74% and 24% over the existing motion distribution feature and the BoW feature, respectively.
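
The abstract above describes ERMH-BoW only at a high level. For readers who want a concrete picture, the following Python sketch illustrates the two central ideas: relative motion histograms between pairs of visual words, and their expansion by diffusing motion among visually related words. The vocabulary size, binning scheme, relatedness matrix, and all function names here are illustrative assumptions, not the authors' implementation.

```python
# Minimal illustrative sketch (not the authors' code) of an ERMH-BoW-style
# descriptor: relative motion histograms between visual words, expanded by
# diffusing motion mass among visually related words.
import numpy as np

K = 500      # visual vocabulary size (assumed)
N_DIR = 8    # number of relative-motion direction bins (assumed)

def relative_motion_histogram(word_ids, motions):
    """Build a (K, K, N_DIR) histogram of relative motion between word pairs.

    word_ids : (n,) int array, visual-word assignment of each keypoint
    motions  : (n, 2) float array, per-keypoint motion vectors (dx, dy)
    """
    hist = np.zeros((K, K, N_DIR))
    n = len(word_ids)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # Relative motion between two keypoints cancels the camera
            # motion that both of them share.
            rel = motions[a] - motions[b]
            mag = np.linalg.norm(rel)
            if mag == 0:
                continue
            angle = np.arctan2(rel[1], rel[0]) % (2 * np.pi)
            d = int(angle / (2 * np.pi / N_DIR)) % N_DIR
            hist[word_ids[a], word_ids[b], d] += mag
    return hist

def expand_histogram(hist, relatedness):
    """Diffuse relative motion among correlated visual words.

    relatedness : (K, K) matrix of visual-relatedness scores in [0, 1]
    (how it is estimated is outside this sketch). The einsum spreads the
    mass of each word pair (i, j) to related pairs (p, q); this is one
    plausible expansion, not necessarily the paper's exact formulation.
    """
    return np.einsum('pi,qj,ijd->pqd', relatedness, relatedness, hist)

# Clip-level descriptors built this way would then be compared with the
# Earth Mover's Distance (e.g. ot.emd2 from the POT library) to score
# event similarity between video clips.
```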

Bibliographic Details
Main Authors: WANG, Feng; JIANG, Yu-Gang; NGO, Chong-wah
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2008
Subjects: Motion relativity; Video event detection; Visual relatedness; Artificial Intelligence and Robotics; Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/6538
https://ink.library.smu.edu.sg/context/sis_research/article/7541/viewcontent/1459359.1459392.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
DOI: 10.1145/1459359.1459392
License: http://creativecommons.org/licenses/by-nc-nd/4.0/