Scalable online kernel learning

One critical deficiency of traditional online kernel learning methods is their increasing and unbounded number of support vectors (SVs), which makes them inefficient and non-scalable for large-scale applications. Recent studies on budget online learning have attempted to overcome this shortcoming by bounding the number of SVs.

Full description

Bibliographic Details
Main Author: LU, Jing
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2017
Subjects:
Online Access:https://ink.library.smu.edu.sg/etd_coll/142
https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=1141&context=etd_coll
Institution: Singapore Management University
Language: English
id sg-smu-ink.etd_coll-1141
record_format dspace
spelling sg-smu-ink.etd_coll-1141 2018-05-09T00:49:26Z Scalable online kernel learning LU, Jing
2017-11-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/etd_coll/142 https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=1141&context=etd_coll http://creativecommons.org/licenses/by-nc-nd/4.0/ Dissertations and Theses Collection (Open Access) eng Institutional Knowledge at Singapore Management University machine learning online learning kernel learning classification kernel approximation scalable learning Databases and Information Systems Theory and Algorithms
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic machine learning
online learning
kernel learning
classification
kernel approximation
scalable learning
Databases and Information Systems
Theory and Algorithms
description One critical deficiency of traditional online kernel learning methods is their increasing and unbounded number of support vectors (SVs), which makes them inefficient and non-scalable for large-scale applications. Recent studies on budget online learning have attempted to overcome this shortcoming by bounding the number of SVs. Despite being extensively studied, budget algorithms usually suffer from several drawbacks.

First, although existing algorithms attempt to bound the number of SVs at each iteration, most of them fail to bound the number of SVs in the final averaged classifier, which is commonly used for online-to-batch conversion. To solve this problem, we propose a novel bounded online kernel learning method, Sparse Passive Aggressive learning (SPA), which yields a final kernel-based hypothesis with a bounded number of support vectors. The idea is to bound the number of SVs with an efficient stochastic sampling strategy that admits an incoming training example as a new SV with probability proportional to the loss it suffers. Since SVs are added selectively and no SV is removed during learning, the proposed SPA algorithm achieves a bounded final averaged classifier. We theoretically prove that SPA achieves an optimal regret bound in expectation, and empirically show that it outperforms various bounded kernel-based online learning algorithms.

Secondly, existing budget learning algorithms are either too simple to achieve satisfactory approximation accuracy or too computationally intensive to scale to large datasets. To overcome this limitation and make online kernel learning efficient and scalable, we explore two functional-approximation-based online kernel machine learning algorithms, Fourier Online Gradient Descent (FOGD) and Nyström Online Gradient Descent (NOGD). The main idea is to use these two methods to approximate the kernel model with a linear classifier, greatly improving efficiency. The encouraging results of our experiments on large-scale datasets validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget online kernel learning approaches.

Thirdly, we extend the proposed algorithms to the online multiple kernel learning (OMKL) problem, where the goal is to significantly reduce the learning complexity of Online Multiple Kernel Classification (OMKC) while achieving accuracy comparable to that of existing unbounded OMKC algorithms. We theoretically prove that the proposed algorithms achieve an optimal regret bound, and empirically show that they outperform various bounded kernel-based online learning algorithms.

In conclusion, this work presents several novel solutions for scaling up online kernel learning to large-scale applications. To facilitate other researchers, we also provide an open-source software toolbox that includes implementations of all algorithms proposed in this thesis, as well as many state-of-the-art online kernel learning algorithms.
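The sampling rule at the core of SPA, as the description sketches it, admits an incoming example as a new support vector with probability proportional to its suffered loss and never removes SVs. Below is a minimal, hypothetical Python sketch of that idea, assuming an RBF kernel, a hinge loss, a PA-I-style step size, and a scaling constant rho; the class name SPASketch and all parameter choices are illustrative, not the thesis's exact formulation.

```python
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.linalg.norm(x - z) ** 2)

class SPASketch:
    """Loss-proportional SV sampling: SVs are only ever added, never removed."""

    def __init__(self, C=1.0, rho=1.0, gamma=1.0, seed=0):
        self.C, self.rho, self.gamma = C, rho, gamma
        self.sv_x, self.sv_alpha = [], []          # support vectors and weights
        self.rng = np.random.default_rng(seed)

    def decision(self, x):
        """f(x) = sum_i alpha_i * k(sv_i, x)."""
        return sum(a * rbf_kernel(s, x, self.gamma)
                   for s, a in zip(self.sv_x, self.sv_alpha))

    def partial_fit(self, x, y):
        """One online round with label y in {-1, +1}."""
        loss = max(0.0, 1.0 - y * self.decision(x))    # hinge loss
        if loss == 0.0:
            return                                     # correct with margin: no update
        # Admit x as a new SV with probability proportional to its loss
        # (capped at 1); rho is an assumed scaling constant.
        p = min(1.0, loss / self.rho)
        if self.rng.random() < p:
            # PA-I-style step size; dividing by p keeps the expected update
            # comparable to the always-add case (an illustrative choice).
            tau = min(self.C, loss / rbf_kernel(x, x, self.gamma)) / p
            self.sv_x.append(x)
            self.sv_alpha.append(tau * y)
```

Because low-loss rounds rarely add an SV and no SV is ever deleted, every intermediate hypothesis, and hence the final averaged classifier, is supported on the same slowly growing SV set, which is the property the description highlights.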
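For FOGD, the "approximate the kernel model with a linear classifier" step described above can be read as the classic random Fourier feature construction of Rahimi and Recht followed by ordinary online gradient descent in the feature space. The sketch below illustrates that reading; the feature dimension D, learning rate eta, and all names are illustrative assumptions. NOGD follows the same pattern, with the random feature map replaced by a Nyström approximation built from a small set of landmark examples.

```python
import numpy as np

class FOGDSketch:
    """Random Fourier features + online gradient descent on the hinge loss."""

    def __init__(self, dim, D=200, gamma=1.0, eta=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # For k(x, z) = exp(-gamma * ||x - z||^2), frequencies are drawn
        # from N(0, 2 * gamma * I) (the kernel's Fourier transform).
        self.W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(D, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        self.w = np.zeros(D)                 # linear weights in feature space
        self.eta = eta
        self.scale = np.sqrt(2.0 / D)

    def features(self, x):
        """Map x to z(x) such that z(x) . z(x') ~= k(x, x')."""
        return self.scale * np.cos(self.W @ x + self.b)

    def predict(self, x):
        return np.sign(self.w @ self.features(x))

    def partial_fit(self, x, y):
        """One OGD step on the hinge loss, y in {-1, +1}."""
        z = self.features(x)
        if y * (self.w @ z) < 1.0:           # margin violated
            self.w += self.eta * y * z
```

Each round costs O(D * dim) regardless of how many examples have been seen, which is why the linearized model scales where a kernel model with a growing SV set does not.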
format text
author LU, Jing
author_facet LU, Jing
author_sort LU, Jing
title Scalable online kernel learning
title_short Scalable online kernel learning
title_full Scalable online kernel learning
title_fullStr Scalable online kernel learning
title_full_unstemmed Scalable online kernel learning
title_sort scalable online kernel learning
publisher Institutional Knowledge at Singapore Management University
publishDate 2017
url https://ink.library.smu.edu.sg/etd_coll/142
https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=1141&context=etd_coll
_version_ 1712300892456222720