Unsupervised Multiple Kernel Learning

Traditional multiple kernel learning (MKL) algorithms are essentially supervised in the sense that the kernel learning task requires class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., an early preprocessing step of a classification task or an unsupervised learning task such as dimension reduction. In this paper, we investigate the problem of Unsupervised Multiple Kernel Learning (UMKL), which does not require the class labels of training data needed in a conventional MKL task. Since a kernel essentially defines pairwise similarity between any two examples, our unsupervised kernel learning method mainly follows two intuitive principles: (1) a good kernel should allow every example to be well reconstructed from its localized bases weighted by the kernel values; (2) a good kernel should induce kernel values that coincide with the local geometry of the data. We formulate the unsupervised multiple kernel learning problem as an optimization task and propose an efficient alternating optimization algorithm to solve it. Empirical results on both classification and dimension reduction tasks validate the efficacy of the proposed UMKL algorithm.
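The two principles in the abstract can be made concrete with a rough sketch of the kind of objective they suggest. The notation below (kernel weights $\mu$, local-basis indicators $d_{ij}$, trade-off parameters $\gamma_1$, $\gamma_2$) is chosen here purely for illustration; the authors' exact formulation and constraints are given in the linked PDF.

\[
\min_{\mu \in \Delta,\; D}\;
\sum_{i=1}^{n} \Big\| x_i - \sum_{j=1}^{n} k_\mu(x_i, x_j)\, d_{ij}\, x_j \Big\|^2
\;+\; \gamma_1 \sum_{i,j} k_\mu(x_i, x_j)\, d_{ij}\, \| x_i - x_j \|^2
\;+\; \gamma_2 \sum_{i} \| d_i \|_1,
\qquad
k_\mu = \sum_{t=1}^{m} \mu_t k_t,
\]

where $\mu$ lies on the simplex $\Delta$ so that $k_\mu$ is a convex combination of the base kernels. The first term reflects principle (1), each example being reconstructed from kernel-weighted localized bases; the second reflects principle (2), penalizing large kernel values between geometrically distant points; and an alternating scheme of the kind described in the abstract would update $\mu$ with the bases $D$ fixed and vice versa.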

Bibliographic Details
Main Authors: ZHUANG, Jinfeng, WANG, Jialei, HOI, Steven C. H., LAN, Xiangyang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2011
Subjects: Computer Sciences; Databases and Information Systems
Online Access:https://ink.library.smu.edu.sg/sis_research/2291
https://ink.library.smu.edu.sg/context/sis_research/article/3291/viewcontent/Unsupervised_Multiple_Kernel_Learning.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
License: http://creativecommons.org/licenses/by-nc-nd/4.0/