Efficient mining of multiple partial near-duplicate alignments by temporal network

This paper considers the mining and localization of near-duplicate segments at arbitrary positions in partial near-duplicate videos in a corpus. A temporal network is proposed to model the visual-temporal consistency between video sequences by embedding temporal constraints as directed edges in the network. Partial alignment is then achieved through network flow programming. To handle multiple alignments, we consider two properties of the network structure, conciseness and divisibility, to ensure that the mining is efficient and effective. Frame-level matching is further integrated into the temporal network for alignment verification, resulting in an iterative alignment-verification procedure that fine-tunes the localization of near-duplicate segments. The scalability of frame-level matching is enhanced by exploring visual keyword matching algorithms. We demonstrate the proposed work by mining partial alignments from two months of broadcast videos across six TV sources.
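The temporal-constraint idea in the abstract can be illustrated with a toy dynamic program (an assumption-laden sketch of my own, not the authors' network-flow formulation): candidate keyframe matches become nodes, and a directed edge is allowed only when both frame indices advance, which is one way of "embedding temporal constraints as directed edges".

```python
# Illustrative sketch only: NOT the paper's network-flow algorithm, just a
# minimal dynamic program over the same kind of temporal-constraint DAG the
# abstract describes. Each node is a candidate keyframe match
# (frame_a, frame_b, similarity); a directed edge links two matches only
# when both frame indices advance, embedding the temporal-consistency
# constraint. The heaviest path through this DAG yields one temporally
# consistent partial alignment.

def best_alignment(matches):
    """matches: list of (frame_a, frame_b, similarity) tuples."""
    if not matches:
        return [], 0.0
    matches = sorted(matches)            # after sorting, edges only go "forward"
    n = len(matches)
    score = [m[2] for m in matches]      # best path score ending at node k
    prev = [-1] * n                      # back-pointers for path recovery
    for k in range(n):
        ia, jb, s = matches[k]
        for p in range(k):
            pa, pb, _ = matches[p]
            # temporal constraint: both frame indices must strictly increase
            if pa < ia and pb < jb and score[p] + s > score[k]:
                score[k] = score[p] + s
                prev[k] = p
    # trace back the heaviest temporally consistent chain of matches
    k = max(range(n), key=score.__getitem__)
    path = []
    while k != -1:
        path.append(matches[k][:2])
        k = prev[k]
    return path[::-1], max(score)
```

This toy version finds a single alignment; the paper instead solves the problem with network flow programming and relies on the conciseness and divisibility of the network structure to mine multiple alignments efficiently, which the sketch does not attempt.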

Bibliographic Details
Main Authors: TAN, Hung-Khoon, NGO, Chong-wah, CHUA, Tat-Seng
Format: text
Language:English
Published: Institutional Knowledge at Singapore Management University 2010
Online Access:https://ink.library.smu.edu.sg/sis_research/6319
https://ink.library.smu.edu.sg/context/sis_research/article/7322/viewcontent/csvt_hktan_10.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-7322
Record format: dspace
DOI: 10.1109/TCSVT.2010.2077531
Date: 2010-11-01
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems
Subjects: Keyword matching; Partial near-duplicate; temporal graph; Graphics and Human Computer Interfaces; OS and Networks