Large scale online kernel learning

In this paper, we present a new framework for large-scale online kernel learning, making kernel methods efficient and scalable for large-scale online learning applications. Unlike the regular budget online kernel learning scheme, which usually uses budget maintenance strategies to bound the number of support vectors, our framework explores a completely different approach: kernel functional approximation techniques that make the subsequent online learning task efficient and scalable. Specifically, we present two different online kernel machine learning algorithms: (i) the Fourier Online Gradient Descent (FOGD) algorithm, which applies random Fourier features to approximate kernel functions; and (ii) the Nyström Online Gradient Descent (NOGD) algorithm, which applies the Nyström method to approximate large kernel matrices. We explore these two approaches to tackle three online learning tasks: binary classification, multi-class classification, and regression. The encouraging results of our experiments on large-scale datasets validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget online kernel learning approaches.
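The FOGD idea described in the abstract, running online gradient descent on random Fourier features that approximate a shift-invariant kernel, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RBF bandwidth, feature count, learning rate, and toy data stream are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 2, 500, 1.0        # input dim, number of random features, RBF bandwidth (assumed)

# Random Fourier features: z(x) @ z(y) approximates the RBF kernel
# k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
W = rng.normal(0.0, 1.0 / sigma, size=(D, d))   # random frequencies ~ N(0, sigma^-2 I)
b = rng.uniform(0.0, 2 * np.pi, size=D)         # random phases

def z(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

# Online gradient descent with hinge loss on the random-feature representation
w = np.zeros(D)
eta = 0.5                                       # learning rate (assumed)

def fogd_step(x, y):
    """One FOGD-style update on example (x, y) with y in {-1, +1}."""
    global w
    zx = z(x)
    if y * (w @ zx) < 1.0:                      # hinge loss is active
        w = w + eta * y * zx

# Toy stream: the label is the sign of the first coordinate
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0, size=d)
    fogd_step(x, 1.0 if x[0] > 0 else -1.0)
```

Because the model is a fixed-length vector `w` rather than a growing set of support vectors, each update costs O(D) no matter how many examples have been seen, which is the scalability argument the abstract makes against budget maintenance schemes.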
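Similarly, the NOGD idea, a Nyström approximation of the kernel matrix followed by linear online learning in the resulting feature space, can be sketched like this. The landmark points here are drawn at random purely for illustration (the paper's own scheme would fix them from the stream, e.g. the earliest support vectors); the bandwidth, landmark count, and toy task are likewise assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0                                     # RBF bandwidth (assumed)

def rbf(a, b):
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

# Landmark points for the Nystrom approximation (random here; assumed for the sketch)
m, d = 50, 2
C = rng.uniform(-1.0, 1.0, size=(m, d))

# K_mm^(-1/2) via eigendecomposition, dropping near-zero eigenvalues for stability
Kmm = np.array([[rbf(ci, cj) for cj in C] for ci in C])
vals, vecs = np.linalg.eigh(Kmm)
keep = vals > 1e-8
Kmm_inv_sqrt = vecs[:, keep] @ np.diag(vals[keep] ** -0.5) @ vecs[:, keep].T

def z(x):
    # Nystrom feature map: z(x) @ z(y) = k_m(x)^T K_mm^{-1} k_m(y), which approximates k(x, y)
    km = np.array([rbf(x, c) for c in C])
    return Kmm_inv_sqrt @ km

# Online gradient descent with hinge loss in the m-dimensional Nystrom space
w = np.zeros(m)
eta = 0.5                                       # learning rate (assumed)
for _ in range(2000):                           # toy stream: label is the sign of x[0]
    x = rng.uniform(-1.0, 1.0, size=d)
    y = 1.0 if x[0] > 0 else -1.0
    zx = z(x)
    if y * (w @ zx) < 1.0:                      # hinge loss is active
        w = w + eta * y * zx
```

The contrast with FOGD is that the feature map is data-dependent: it is built from actual kernel evaluations against m landmark points, so the per-update cost is O(m) kernel evaluations plus an O(m) linear update, again independent of the stream length.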


Saved in:
Bibliographic Details
Main Authors: LU, Jing; HOI, Steven C. H.; WANG, Jialei; ZHAO, Peilin; LIU, Zhi-Yong
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2016
Subjects: online learning; kernel method; large scale machine learning; Computer Sciences; Databases and Information Systems; Theory and Algorithms
Online Access:https://ink.library.smu.edu.sg/sis_research/3410
https://ink.library.smu.edu.sg/context/sis_research/article/4411/viewcontent/Largescaleonlinekernellearning.pdf
Institution: Singapore Management University
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
Content Provider: SMU Libraries
Date Published: 2016-04-01