Online multi-modal distance learning for scalable multimedia retrieval

Bibliographic Details
Main Authors: XIA, Hao, WU, Pengcheng, HOI, Steven C. H.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2013
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/2337
https://ink.library.smu.edu.sg/context/sis_research/article/3337/viewcontent/Online_Multi_modal_Distance_Learning_for_Scalable_Multimedia_Retrieval.pdf
Institution: Singapore Management University
Description
Summary: In many real-world scenarios, e.g., multimedia applications, data often originate from multiple heterogeneous sources or are represented by diverse types of representation, which is often referred to as "multi-modal data". Defining a distance between any two objects/items over multi-modal data is a key challenge encountered by many real-world applications, including multimedia retrieval. In this paper, we present a novel online learning framework for learning distance functions on multi-modal data through the combination of multiple kernels. To address large-scale multimedia applications, we propose Online Multi-modal Distance Learning (OMDL) algorithms, which are significantly more efficient and scalable than state-of-the-art techniques. We conducted an extensive set of experiments on multi-modal image retrieval applications, in which encouraging results validate the efficacy of the proposed technique.
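
The record contains no algorithmic detail beyond the abstract, but the core idea it describes, combining per-modality kernel-induced distances with weights learned online, can be illustrated with a minimal sketch. The RBF kernels, the triplet-based multiplicative weight update, and all names and parameters below are illustrative assumptions, not the paper's exact OMDL formulation.

import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # RBF kernel between two feature vectors (assumed kernel choice).
    diff = a - b
    return np.exp(-gamma * diff.dot(diff))

def kernel_distance(a, b, kernel):
    # Squared distance induced by a kernel: k(a,a) + k(b,b) - 2*k(a,b).
    return kernel(a, a) + kernel(b, b) - 2.0 * kernel(a, b)

class OnlineMultiModalDistance:
    # Keeps one weight per modality/kernel and combines the per-modality
    # kernel-induced distances into a single distance function. Weights are
    # updated online from relative (triplet) constraints with multiplicative
    # updates -- a simplification, not the authors' exact algorithm.

    def __init__(self, kernels, eta=0.1):
        self.kernels = kernels                         # one kernel per modality
        self.w = np.ones(len(kernels)) / len(kernels)  # uniform initial weights
        self.eta = eta                                 # learning rate

    def distance(self, x, y):
        # x and y are lists of per-modality feature vectors.
        d = np.array([kernel_distance(xm, ym, k)
                      for xm, ym, k in zip(x, y, self.kernels)])
        return float(self.w.dot(d)), d

    def update(self, x, pos, neg):
        # Triplet constraint: x should be closer to pos than to neg.
        # Modalities that violate the constraint are down-weighted.
        _, d_pos = self.distance(x, pos)
        _, d_neg = self.distance(x, neg)
        loss = np.maximum(0.0, d_pos - d_neg)  # per-modality hinge-like loss
        self.w *= np.exp(-self.eta * loss)     # multiplicative update
        self.w /= self.w.sum()                 # keep weights on the simplex

# Example with two hypothetical modalities (e.g., a visual and a textual descriptor):
kernels = [lambda a, b: rbf_kernel(a, b, gamma=0.5),
           lambda a, b: rbf_kernel(a, b, gamma=2.0)]
model = OnlineMultiModalDistance(kernels, eta=0.05)
x   = [np.random.rand(8), np.random.rand(4)]
pos = [np.random.rand(8), np.random.rand(4)]
neg = [np.random.rand(8), np.random.rand(4)]
model.update(x, pos, neg)
print(model.distance(x, pos)[0], model.distance(x, neg)[0])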