Laplacian embedded regression for scalable manifold regularization

Semi-supervised learning (SSL), as a powerful tool to learn from a limited number of labeled data and a large number of unlabeled data, has been attracting increasing attention in the machine learning community. In particular, the manifold regularization framework has laid solid theoretical foundations for a large family of SSL algorithms.


Bibliographic Details
Main Authors: Chen, Lin, Tsang, Ivor Wai-Hung, Xu, Dong
Other Authors: School of Computer Engineering
Format: Article
Language: English
Published: 2013
Subjects:
Online Access:https://hdl.handle.net/10356/99250
http://hdl.handle.net/10220/13482
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-99250
record_format dspace
spelling sg-ntu-dr.10356-992502020-05-28T07:18:04Z Laplacian embedded regression for scalable manifold regularization Chen, Lin Tsang, Ivor Wai-Hung Xu, Dong School of Computer Engineering DRNTU::Engineering::Computer science and engineering Semi-supervised learning (SSL), as a powerful tool to learn from a limited number of labeled data and a large number of unlabeled data, has been attracting increasing attention in the machine learning community. In particular, the manifold regularization framework has laid solid theoretical foundations for a large family of SSL algorithms, such as Laplacian support vector machine (LapSVM) and Laplacian regularized least squares (LapRLS). However, most of these algorithms are limited to small-scale problems due to the high computational cost of the matrix inversion operation involved in the optimization problem. In this paper, we propose a novel framework called Laplacian embedded regression by introducing an intermediate decision variable into the manifold regularization framework. By using the ε-insensitive loss, we obtain the Laplacian embedded support vector regression (LapESVR) algorithm, which inherits the sparse solution from SVR. Also, we derive Laplacian embedded RLS (LapERLS) corresponding to RLS under the proposed framework. Both LapESVR and LapERLS possess a simpler form of a transformed kernel, which is the summation of the original kernel and a graph kernel that captures the manifold structure. The benefits of the transformed kernel are two-fold: (1) we can deal with the original kernel matrix and the graph Laplacian matrix in the graph kernel separately, and (2) if the graph Laplacian matrix is sparse, we only need to perform the inverse operation for a sparse matrix, which is much more efficient when compared with that for a dense one.
Inspired by kernel principal component analysis, we further propose to project the introduced decision variable into a subspace spanned by a few eigenvectors of the graph Laplacian matrix in order to better reflect the data manifold, as well as accelerate the calculation of the graph kernel, allowing our methods to efficiently and effectively cope with large-scale SSL problems. Extensive experiments on both toy and real-world data sets show the effectiveness and scalability of the proposed framework. 2013-09-16T06:56:54Z 2019-12-06T20:05:03Z 2013-09-16T06:56:54Z 2019-12-06T20:05:03Z 2012 2012 Journal Article Chen, L., Tsang, I. W., & Xu, D. (2012). Laplacian Embedded Regression for Scalable Manifold Regularization. IEEE Transactions on Neural Networks and Learning Systems, 23(6), 902-915. 2162-237X https://hdl.handle.net/10356/99250 http://hdl.handle.net/10220/13482 10.1109/TNNLS.2012.2190420 en IEEE transactions on neural networks and learning systems © 2012 IEEE
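The transformed-kernel idea in the abstract (original kernel plus a graph kernel, with only a sparse matrix to invert) can be sketched in a few lines. This is a minimal illustration only, assuming an RBF base kernel, a binary k-NN graph, and the common Laplacian-smoothing graph kernel (I + μL)⁻¹; the paper's exact graph-kernel form may differ, and all function names here are hypothetical.

```python
import numpy as np
from scipy.sparse import csr_matrix, identity
from scipy.sparse.linalg import spsolve
from scipy.spatial.distance import cdist

def rbf_kernel(X, gamma=1.0):
    # Original (dense) kernel matrix K from an RBF kernel.
    return np.exp(-gamma * cdist(X, X, "sqeuclidean"))

def knn_graph_laplacian(X, k=5):
    # Sparse k-NN graph Laplacian L = D - W with binary weights.
    d2 = cdist(X, X, "sqeuclidean")
    n = len(X)
    W = np.zeros((n, n))
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]  # skip self at column 0
    for i in range(n):
        W[i, nn[i]] = 1.0
    W = np.maximum(W, W.T)                   # symmetrize
    return csr_matrix(np.diag(W.sum(axis=1)) - W)

def transformed_kernel(K, L, mu=1.0):
    # Transformed kernel = original kernel + graph kernel. Because L is
    # sparse, the "inverse" in the graph kernel is a sparse solve rather
    # than a dense inversion -- the efficiency point made in the abstract.
    n = K.shape[0]
    graph_kernel = spsolve((identity(n, format="csc") + mu * L).tocsc(),
                           np.eye(n))
    return K + graph_kernel
```

Note that the two benefits named in the abstract show up directly: the kernel matrix K and the graph Laplacian L are handled separately, and the only inversion-like operation touches the sparse matrix I + μL.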
institution Nanyang Technological University
building NTU Library
country Singapore
collection DR-NTU
language English
topic DRNTU::Engineering::Computer science and engineering
spellingShingle DRNTU::Engineering::Computer science and engineering
Chen, Lin
Tsang, Ivor Wai-Hung
Xu, Dong
Laplacian embedded regression for scalable manifold regularization
description Semi-supervised learning (SSL), as a powerful tool to learn from a limited number of labeled data and a large number of unlabeled data, has been attracting increasing attention in the machine learning community. In particular, the manifold regularization framework has laid solid theoretical foundations for a large family of SSL algorithms, such as Laplacian support vector machine (LapSVM) and Laplacian regularized least squares (LapRLS). However, most of these algorithms are limited to small-scale problems due to the high computational cost of the matrix inversion operation involved in the optimization problem. In this paper, we propose a novel framework called Laplacian embedded regression by introducing an intermediate decision variable into the manifold regularization framework. By using the ε-insensitive loss, we obtain the Laplacian embedded support vector regression (LapESVR) algorithm, which inherits the sparse solution from SVR. Also, we derive Laplacian embedded RLS (LapERLS) corresponding to RLS under the proposed framework. Both LapESVR and LapERLS possess a simpler form of a transformed kernel, which is the summation of the original kernel and a graph kernel that captures the manifold structure. The benefits of the transformed kernel are two-fold: (1) we can deal with the original kernel matrix and the graph Laplacian matrix in the graph kernel separately, and (2) if the graph Laplacian matrix is sparse, we only need to perform the inverse operation for a sparse matrix, which is much more efficient when compared with that for a dense one. Inspired by kernel principal component analysis, we further propose to project the introduced decision variable into a subspace spanned by a few eigenvectors of the graph Laplacian matrix in order to better reflect the data manifold, as well as accelerate the calculation of the graph kernel, allowing our methods to efficiently and effectively cope with large-scale SSL problems.
Extensive experiments on both toy and real-world data sets show the effectiveness and scalability of the proposed framework.
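The eigenvector-subspace acceleration mentioned in the description can be sketched as follows: restrict the graph kernel to the span of a few smoothest eigenvectors of the sparse graph Laplacian. This is an illustrative sketch only; it assumes the graph-kernel form (I + μL)⁻¹, the function name is hypothetical, and the paper's actual projection of the intermediate decision variable is more involved.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import eigsh

def low_rank_graph_kernel(L, mu=1.0, m=5):
    # Approximate the graph kernel (I + mu*L)^(-1) in the subspace spanned
    # by the m eigenvectors of L with the smallest eigenvalues -- the
    # smoothest directions on the data manifold.
    vals, vecs = eigsh(L, k=m, which="SM")  # m smallest eigenpairs of sparse L
    # Within that subspace the kernel is diagonal: 1 / (1 + mu * lambda_i).
    return vecs @ np.diag(1.0 / (1.0 + mu * vals)) @ vecs.T
```

The cost shifts from a full n×n solve to computing m eigenpairs of a sparse matrix, which is what makes the approach scale to large unlabeled sets.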
author2 School of Computer Engineering
author_facet School of Computer Engineering
Chen, Lin
Tsang, Ivor Wai-Hung
Xu, Dong
format Article
author Chen, Lin
Tsang, Ivor Wai-Hung
Xu, Dong
author_sort Chen, Lin
title Laplacian embedded regression for scalable manifold regularization
title_short Laplacian embedded regression for scalable manifold regularization
title_full Laplacian embedded regression for scalable manifold regularization
title_fullStr Laplacian embedded regression for scalable manifold regularization
title_full_unstemmed Laplacian embedded regression for scalable manifold regularization
title_sort laplacian embedded regression for scalable manifold regularization
publishDate 2013
url https://hdl.handle.net/10356/99250
http://hdl.handle.net/10220/13482
_version_ 1681059616527482880