Fast bounded online gradient descent algorithms for scalable kernel-based online learning
Kernel-based online learning has often shown state-of-the-art performance for many online learning tasks. It suffers, however, from a major shortcoming: the unbounded number of support vectors, which makes it non-scalable and unsuitable for applications with large-scale datasets. In this work, we study the problem of bounded kernel-based online learning, which aims to constrain the number of support vectors by a predefined budget. Although several algorithms have been proposed in the literature, they are neither computationally efficient, owing to their intensive budget maintenance strategies, nor effective, owing to their reliance on the simple Perceptron algorithm. To overcome these limitations, we propose a framework for bounded kernel-based online learning based on an online gradient descent approach. We propose two efficient bounded online gradient descent (BOGD) algorithms for scalable kernel-based online learning: (i) BOGD, which maintains support vectors by uniform sampling, and (ii) BOGD++, which maintains support vectors by non-uniform sampling. We present a theoretical analysis of the regret bounds for both algorithms and find promising empirical performance, in terms of both efficacy and efficiency, when comparing them to several well-known algorithms for bounded kernel-based online learning on large-scale datasets.
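The budget-maintenance idea in the abstract can be sketched in a few lines: run kernelized online gradient descent and, whenever the support set exceeds the budget, evict one support vector chosen uniformly at random (the BOGD strategy; BOGD++ replaces the uniform draw with a non-uniform one). The class below is an illustrative sketch only, assuming a hinge loss and an RBF kernel; the class name, step sizes, and eviction rule are hypothetical stand-ins, not the authors' exact update.

```python
import math
import random

def rbf(x, y, gamma=1.0):
    # RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

class BOGDSketch:
    """Bounded kernelized online gradient descent with uniform-sampling
    budget maintenance (illustrative sketch, not the paper's exact update)."""

    def __init__(self, budget=20, eta=0.2, lam=0.01, gamma=1.0, seed=0):
        self.budget, self.eta, self.lam, self.gamma = budget, eta, lam, gamma
        self.sv = []                  # support vectors as (example, coefficient) pairs
        self.rng = random.Random(seed)

    def score(self, x):
        # f(x) = sum_i alpha_i * k(x_i, x) over the current support set
        return sum(a * rbf(xi, x, self.gamma) for xi, a in self.sv)

    def update(self, x, y):
        margin = y * self.score(x)
        # shrink existing coefficients (regularization part of the OGD step)
        self.sv = [(xi, (1 - self.eta * self.lam) * a) for xi, a in self.sv]
        if margin < 1:
            # hinge loss is active: add x as a new support vector
            self.sv.append((x, self.eta * y))
            if len(self.sv) > self.budget:
                # budget exceeded: evict one support vector uniformly at random
                self.sv.pop(self.rng.randrange(len(self.sv)))

    def predict(self, x):
        return 1 if self.score(x) >= 0 else -1
```

Because each round touches at most `budget` support vectors, per-round cost stays constant regardless of stream length, which is the scalability property the abstract highlights; a non-uniform eviction distribution (as in BOGD++) would slot in by replacing the single `randrange` call.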
Main Authors: | ZHAO, Peilin; WANG, Jialei; WU, Pengcheng; JIN, Rong; HOI, Steven C. H. |
---|---|
Format: | text |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2012 |
Subjects: | Computationally efficient; Empirical performance; Gradient descent; Gradient descent algorithms; Large-scale datasets; Computer Sciences; Databases and Information Systems; Theory and Algorithms |
Online Access: | https://ink.library.smu.edu.sg/sis_research/2342 https://ink.library.smu.edu.sg/context/sis_research/article/3342/viewcontent/Fast_Bounded_Online_Gradient_Descent_Algorithms_for_Scalable_Kernel_Based_Online_Learning.pdf |
Institution: | Singapore Management University |
field | value
---|---
id | sg-smu-ink.sis_research-3342
record_format | dspace
updated | 2020-04-02T07:11:13Z
title | Fast bounded online gradient descent algorithms for scalable kernel-based online learning
author | ZHAO, Peilin; WANG, Jialei; WU, Pengcheng; JIN, Rong; HOI, Steven C. H.
description | Kernel-based online learning has often shown state-of-the-art performance for many online learning tasks. It suffers, however, from a major shortcoming: the unbounded number of support vectors, which makes it non-scalable and unsuitable for applications with large-scale datasets. In this work, we study the problem of bounded kernel-based online learning, which aims to constrain the number of support vectors by a predefined budget. Although several algorithms have been proposed in the literature, they are neither computationally efficient, owing to their intensive budget maintenance strategies, nor effective, owing to their reliance on the simple Perceptron algorithm. To overcome these limitations, we propose a framework for bounded kernel-based online learning based on an online gradient descent approach. We propose two efficient bounded online gradient descent (BOGD) algorithms for scalable kernel-based online learning: (i) BOGD, which maintains support vectors by uniform sampling, and (ii) BOGD++, which maintains support vectors by non-uniform sampling. We present a theoretical analysis of the regret bounds for both algorithms and find promising empirical performance, in terms of both efficacy and efficiency, when comparing them to several well-known algorithms for bounded kernel-based online learning on large-scale datasets.
topic | Computationally efficient; Empirical performance; Gradient descent; Gradient descent algorithms; Large-scale datasets; Computer Sciences; Databases and Information Systems; Theory and Algorithms
format | text (application/pdf)
rights | http://creativecommons.org/licenses/by-nc-nd/4.0/
collection | Research Collection School Of Computing and Information Systems, InK@SMU
institution | Singapore Management University (SMU Libraries, Singapore, Asia)
language | English
publisher | Institutional Knowledge at Singapore Management University
publishDate | 2012-07-01
url | https://ink.library.smu.edu.sg/sis_research/2342 https://ink.library.smu.edu.sg/context/sis_research/article/3342/viewcontent/Fast_Bounded_Online_Gradient_Descent_Algorithms_for_Scalable_Kernel_Based_Online_Learning.pdf
_version_ | 1770572103999815680