On the annotation of web videos by efficient near-duplicate search
With the proliferation of Web 2.0 applications, user-supplied social tags are commonly available in social media as a means to bridge the semantic gap. On the other hand, the explosive expansion of the social web makes an overwhelming number of web videos available, among which there exists a large number of near-duplicate videos. In this paper, we investigate techniques which allow effective annotation of web videos from a data-driven perspective. A novel classifier-free video annotation framework is proposed by first retrieving visual duplicates and then suggesting representative tags. The significance of this paper lies in addressing two timely issues for annotating query videos. First, we provide a novel solution for fast near-duplicate video retrieval. Second, based on the outcome of near-duplicate search, we explore the potential that data-driven annotation could be successful when a huge volume of tagged web videos is freely accessible online. Experiments on cross sources (annotating Google videos and Yahoo! videos using YouTube videos) and cross time periods (annotating YouTube videos using historical data) show the effectiveness and efficiency of the proposed classifier-free approach for web video tag annotation.
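The classifier-free idea described in the abstract (retrieve near-duplicate videos, then suggest the most representative tags among them) can be sketched as follows. This is a minimal illustration, not the paper's actual method: the signature representation, `similarity` function, threshold, and frequency-based tag ranking are all assumptions introduced here.

```python
from collections import Counter

def suggest_tags(query_sig, corpus, similarity, threshold=0.8, top_k=5):
    """Classifier-free annotation: vote with tags of near-duplicate videos.

    query_sig  -- visual signature of the query video (representation assumed)
    corpus     -- list of (signature, tags) pairs for already-tagged web videos
    similarity -- function scoring two signatures in [0, 1] (assumed given)
    """
    votes = Counter()
    for sig, tags in corpus:
        if similarity(query_sig, sig) >= threshold:  # treat as near-duplicate
            votes.update(tags)
    # the most frequent tags among the duplicates become the suggestion
    return [tag for tag, _ in votes.most_common(top_k)]

# toy similarity over equal-length feature vectors (illustrative only)
def dummy_sim(a, b):
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

corpus = [([0.90, 0.10], ["cat", "funny"]),
          ([0.88, 0.12], ["cat", "pet"]),
          ([0.10, 0.90], ["news"])]
print(suggest_tags([0.9, 0.1], corpus, dummy_sim))  # prints ['cat', 'funny', 'pet']
```

The point of the sketch is that no per-tag classifier is trained: annotation quality rests entirely on the near-duplicate retrieval step and on the volume of tagged videos in the corpus, which is the data-driven premise the paper evaluates.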
Main Authors: ZHAO, Wan-Lei; WU, Xiao; NGO, Chong-wah
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2010
Subjects: Data-driven; near-duplicate video search; video annotation; web video; Data Storage Systems; Graphics and Human Computer Interfaces
Online Access: https://ink.library.smu.edu.sg/sis_research/6336 https://ink.library.smu.edu.sg/context/sis_research/article/7339/viewcontent/10.1.1.330.9344.pdf
Institution: Singapore Management University
Language: English
id: sg-smu-ink.sis_research-7339
record_format: dspace
spelling: sg-smu-ink.sis_research-7339 2021-11-23T04:32:16Z On the annotation of web videos by efficient near-duplicate search ZHAO, Wan-Lei; WU, Xiao; NGO, Chong-wah. With the proliferation of Web 2.0 applications, user-supplied social tags are commonly available in social media as a means to bridge the semantic gap. On the other hand, the explosive expansion of the social web makes an overwhelming number of web videos available, among which there exists a large number of near-duplicate videos. In this paper, we investigate techniques which allow effective annotation of web videos from a data-driven perspective. A novel classifier-free video annotation framework is proposed by first retrieving visual duplicates and then suggesting representative tags. The significance of this paper lies in addressing two timely issues for annotating query videos. First, we provide a novel solution for fast near-duplicate video retrieval. Second, based on the outcome of near-duplicate search, we explore the potential that data-driven annotation could be successful when a huge volume of tagged web videos is freely accessible online. Experiments on cross sources (annotating Google videos and Yahoo! videos using YouTube videos) and cross time periods (annotating YouTube videos using historical data) show the effectiveness and efficiency of the proposed classifier-free approach for web video tag annotation. 2010-08-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/6336 info:doi/10.1109/TMM.2010.2050651 https://ink.library.smu.edu.sg/context/sis_research/article/7339/viewcontent/10.1.1.330.9344.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Data-driven; near-duplicate video search; video annotation; web video; Data Storage Systems; Graphics and Human Computer Interfaces
institution: Singapore Management University
building: SMU Libraries
continent: Asia
country: Singapore
content_provider: SMU Libraries
collection: InK@SMU
language: English
topic: Data-driven; near-duplicate video search; video annotation; web video; Data Storage Systems; Graphics and Human Computer Interfaces
spellingShingle: Data-driven; near-duplicate video search; video annotation; web video; Data Storage Systems; Graphics and Human Computer Interfaces ZHAO, Wan-Lei; WU, Xiao; NGO, Chong-wah On the annotation of web videos by efficient near-duplicate search
description: With the proliferation of Web 2.0 applications, user-supplied social tags are commonly available in social media as a means to bridge the semantic gap. On the other hand, the explosive expansion of the social web makes an overwhelming number of web videos available, among which there exists a large number of near-duplicate videos. In this paper, we investigate techniques which allow effective annotation of web videos from a data-driven perspective. A novel classifier-free video annotation framework is proposed by first retrieving visual duplicates and then suggesting representative tags. The significance of this paper lies in addressing two timely issues for annotating query videos. First, we provide a novel solution for fast near-duplicate video retrieval. Second, based on the outcome of near-duplicate search, we explore the potential that data-driven annotation could be successful when a huge volume of tagged web videos is freely accessible online. Experiments on cross sources (annotating Google videos and Yahoo! videos using YouTube videos) and cross time periods (annotating YouTube videos using historical data) show the effectiveness and efficiency of the proposed classifier-free approach for web video tag annotation.
format: text
author: ZHAO, Wan-Lei; WU, Xiao; NGO, Chong-wah
author_facet: ZHAO, Wan-Lei; WU, Xiao; NGO, Chong-wah
author_sort: ZHAO, Wan-Lei
title: On the annotation of web videos by efficient near-duplicate search
title_short: On the annotation of web videos by efficient near-duplicate search
title_full: On the annotation of web videos by efficient near-duplicate search
title_fullStr: On the annotation of web videos by efficient near-duplicate search
title_full_unstemmed: On the annotation of web videos by efficient near-duplicate search
title_sort: on the annotation of web videos by efficient near-duplicate search
publisher: Institutional Knowledge at Singapore Management University
publishDate: 2010
url: https://ink.library.smu.edu.sg/sis_research/6336 https://ink.library.smu.edu.sg/context/sis_research/article/7339/viewcontent/10.1.1.330.9344.pdf
_version_: 1770575937161658368