Online learning with kernels

Bibliographic Details
Main Author: Fan, Haijin
Other Authors: Song, Qing
Format: Theses and Dissertations
Language: English
Published: 2014
Online Access:https://hdl.handle.net/10356/61869
Institution: Nanyang Technological University
Description
Summary: Kernel methods are popular nonparametric modeling tools in machine learning. A Mercer kernel function maps feature vectors from a low-dimensional space into a high- or even infinite-dimensional reproducing kernel Hilbert space (RKHS), where linear solutions can be found. Kernel methods have been widely applied in batch (off-line) learning for various applications, and extending them to online learning is a promising and interesting topic. The main challenges for kernel online learning algorithms are computational complexity, generalization performance, convergence, and stability. In this thesis, several kernel-based algorithms for online learning are thoroughly investigated.

In the online learning context, sparsification methods are proposed to curb the growing number of kernel functions and reduce the computational complexity of the kernel algorithms. A sparsification method selects a compact dictionary of kernel function centers. Such methods are developed according to different measures that evaluate the novelty of new training samples, such as the mutual information measure, the significance measure, the coherence measure, or the cumulative coherence measure (a coherence-based check is sketched below).

To update the kernel weights, traditional linear and nonlinear online learning algorithms are extended to the RKHS, yielding kernel online learning algorithms such as the kernel recursive least squares (KRLS) algorithms and the kernel (regularized) least mean square (KLMS) algorithms, which incorporate suitable sparsification methods to select their kernel centers (a generic KLMS-style update is sketched below). Convergence and stability of the learning algorithms under external disturbance are guaranteed with adaptive training methods via the Lyapunov stability theorem.

A novel unified framework is also proposed for kernel online learning with adaptive kernels. In this framework, the kernel width is not fixed during training; instead, it is treated as an additional free parameter and adapted automatically. The algorithm adapts the kernel width to the training data effectively under different initial kernel widths, improving both testing accuracy and convergence speed (a possible width update is sketched below).

Furthermore, the recurrent model is extended to kernel methods for the first time, and a linear recurrent kernel algorithm is proposed for online learning. The algorithm introduces a recurrent term that is linearly related to the previous output, making past information reusable in the update. To ensure that reusing the recurrent information indeed accelerates convergence, a hybrid training algorithm with guaranteed convergence is proposed for the recurrent training (see the recurrent sketch below).

The proposed algorithms are thoroughly tested on both artificial and real-world datasets against several existing online learning algorithms. The results demonstrate that the proposed algorithms are effective in both generalization performance and convergence speed, at a comparable or lower computational cost.
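To make the dictionary selection concrete, here is a minimal sketch of a coherence-based novelty check, assuming a Gaussian kernel and an illustrative threshold mu0 (the specific measures and settings used in the thesis are not reproduced here): a new sample is admitted to the dictionary only if its maximum kernel similarity to the current centers stays below the threshold.

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    """Gaussian (RBF) Mercer kernel; the width is an illustrative value."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(diff, diff) / (2.0 * width ** 2))

def admit_by_coherence(x_new, centers, mu0=0.5, width=1.0):
    """Coherence-based novelty check: admit x_new only if
    max_i |k(x_new, c_i)| <= mu0 (mu0 is a hypothetical threshold)."""
    if not centers:
        return True
    coherence = max(abs(gaussian_kernel(x_new, c, width)) for c in centers)
    return coherence <= mu0
```

Bounding the coherence by mu0 keeps the dictionary elements nearly orthogonal in the RKHS, which is what curbs the growth in the number of kernel functions.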
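The KLMS-style weight update follows the generic form f_t = f_{t-1} + eta * e_t * k(x_t, .), appending one kernel unit per admitted sample. The sketch below reuses gaussian_kernel and admit_by_coherence from the previous block; the step size eta is illustrative, and the thesis's regularized and Lyapunov-stabilized variants are not shown.

```python
class KLMS:
    """Minimal kernel least-mean-square sketch:
    f_t = f_{t-1} + eta * e_t * k(x_t, .), with a coherence-sparsified
    dictionary (eta, width, mu0 are illustrative values)."""

    def __init__(self, eta=0.2, width=1.0, mu0=0.5):
        self.eta, self.width, self.mu0 = eta, width, mu0
        self.centers, self.weights = [], []

    def predict(self, x):
        # Kernel expansion over the current dictionary.
        return sum(w * gaussian_kernel(x, c, self.width)
                   for c, w in zip(self.centers, self.weights))

    def update(self, x, y):
        e = y - self.predict(x)  # instantaneous prediction error
        if admit_by_coherence(x, self.centers, self.mu0, self.width):
            self.centers.append(x)            # grow the dictionary
            self.weights.append(self.eta * e)
        # When the sample is not novel, this simplified sketch leaves
        # the weights unchanged; fuller variants would still adapt them.
        return e
```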
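One plausible realization of the adaptive kernel width is a stochastic gradient step on the instantaneous squared error with respect to the Gaussian width. This is our assumption for illustration; the thesis's exact update rule is not reproduced here.

```python
import numpy as np

def adapt_width(x, e, centers, weights, width, eta_w=0.01):
    """One descent step on the instantaneous loss 0.5 * e**2 with
    respect to the Gaussian width w (hypothetical update rule).
    For k_i = exp(-||x - c_i||^2 / (2 w^2)):
        d k_i / d w = k_i * ||x - c_i||^2 / w^3,
    so with e = y - f(x) the descent step is
        w <- w + eta_w * e * sum_i weights[i] * d k_i / d w."""
    grad_f = 0.0
    for c, wgt in zip(centers, weights):
        d2 = float(np.dot(np.asarray(x) - np.asarray(c),
                          np.asarray(x) - np.asarray(c)))
        k = np.exp(-d2 / (2.0 * width ** 2))
        grad_f += wgt * k * d2 / width ** 3
    return width + eta_w * e * grad_f
```

Calling adapt_width after each KLMS.update step would let the width drift toward a value suited to the data, regardless of its initialization, which matches the abstract's claim about robustness to different initial kernel widths.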
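Finally, the linear recurrent term can be read as a scalar feedback coefficient on the previous output, y_t = f(x_t) + gamma * y_{t-1}. The sketch below builds on the KLMS class above; gamma and the residual-based training step are our illustrative assumptions, and the hybrid training algorithm that guarantees convergence is not shown.

```python
class RecurrentKernelFilter:
    """Sketch of a linear recurrent kernel model,
    y_t = f(x_t) + gamma * y_{t-1}, where f is the KLMS expansion
    above and gamma is a hypothetical recurrent coefficient."""

    def __init__(self, gamma=0.3, **klms_kwargs):
        self.gamma = gamma
        self.klms = KLMS(**klms_kwargs)
        self.prev_output = 0.0

    def step(self, x, y):
        # The recurrent term feeds the previous output back in,
        # making past information reusable in the prediction.
        y_hat = self.klms.predict(x) + self.gamma * self.prev_output
        e = y - y_hat
        # Train the kernel part on the target minus the recurrent term,
        # so f(x_t) learns the residual (an illustrative choice).
        self.klms.update(x, y - self.gamma * self.prev_output)
        self.prev_output = y_hat
        return y_hat, e
```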