Large scale kernel methods for online AUC maximization
Learning to optimize AUC performance for classifying label-imbalanced data in online scenarios has been extensively studied in recent years. Most existing work has attempted to address the problem directly in the original feature space, which may not be suitable for non-linearly separable datasets. To address this issue, kernel-based learning methods have been proposed for non-linearly separable datasets; however, such kernel approaches have been shown to be inefficient and fail to scale well on large datasets in practice. Taking this cue, in this work we explore the use of two scalable kernel approximation techniques, random Fourier features and the Nyström method, as surrogates to existing approaches, and provide insights into the differences between the two methods based on their online performance. In contrast to conventional kernel-based learning methods, which suffer from the high computational complexity of the kernel matrix, our proposed approaches alleviate this issue with linear features that approximate the kernel function/matrix. Specifically, two surrogate kernel-based learning models are presented for the online AUC maximization task: (i) the Fourier Online AUC Maximization (FOAM) algorithm, which samples basis functions from a data-independent distribution to approximate the kernel function; and (ii) the Nyström Online AUC Maximization (NOAM) algorithm, which samples a subset of instances from the training data to approximate the kernel matrix by a low-rank matrix. Another novelty of this work is a mini-batch Online Gradient Descent method for model updating that controls the noise and reduces the variance of the gradients. We provide theoretical analyses for both algorithms. Empirical studies on commonly used large-scale datasets show that the proposed algorithms outperform existing state-of-the-art methods in both AUC performance and computational efficiency.
Main Authors: | DING, Yi; LIU, Chenghao; ZHAO, Peilin; HOI, Steven C. H. |
Format: | text |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2017 |
Subjects: | Kernel; Computational modeling; Approximation algorithms; Machine learning algorithms; Support vector machines; Learning systems; Computer Sciences; Databases and Information Systems; Theory and Algorithms |
DOI: | 10.1109/ICDM.2017.18 |
Online Access: | https://ink.library.smu.edu.sg/sis_research/3969 https://ink.library.smu.edu.sg/context/sis_research/article/4971/viewcontent/6._Large_Scale_Kernel_Methods_for_Online_AUC__ICDM2017_.pdf |
Institution: | Singapore Management University |
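To make the abstract concrete, below is a minimal sketch of the standard random Fourier feature construction that FOAM builds on: frequencies are sampled from a data-independent spectral distribution so that inner products of the mapped features approximate a kernel value. The function name, parameters, and the choice of RBF kernel are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def make_rff_map(dim, n_components=200, gamma=1.0, seed=0):
    """Fixed random Fourier feature map z(.) approximating the RBF
    kernel k(x, y) = exp(-gamma * ||x - y||^2) (Rahimi & Recht, 2007)."""
    rng = np.random.default_rng(seed)
    # For the RBF kernel, the spectral (frequency) distribution is the
    # data-independent Gaussian N(0, 2 * gamma * I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(dim, n_components))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_components)

    def z(X):
        # (n, dim) -> (n, n_components); z(x) @ z(y) ~= k(x, y)
        return np.sqrt(2.0 / n_components) * np.cos(np.atleast_2d(X) @ W + b)

    return z

# Usage: compare the approximate and exact kernel values for two points.
z = make_rff_map(dim=10, n_components=2000, gamma=0.5, seed=42)
x, y = np.random.default_rng(1).normal(size=(2, 10))
approx = float(z(x) @ z(y).T)
exact = float(np.exp(-0.5 * np.sum((x - y) ** 2)))
print(approx, exact)  # close for large n_components
```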
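Similarly, NOAM rests on the classical Nyström approximation: sample a small set of landmark instances from the training data, form their kernel matrix, and use its pseudo-inverse square root to define a finite-dimensional feature map whose inner products give a low-rank approximation of the full kernel matrix. The sketch below again assumes an RBF kernel and illustrative names; it is not the paper's implementation.

```python
import numpy as np

def make_nystrom_map(X, n_landmarks=100, gamma=1.0, seed=0):
    """Nystrom feature map built from a sample of training instances,
    so that z(x) @ z(y) approximates the RBF kernel k(x, y)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_landmarks, replace=False)
    L = X[idx]  # landmark instances sampled from the training data

    def rbf(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    K_mm = rbf(L, L)  # m x m kernel matrix on the landmarks
    # Pseudo-inverse square root of K_mm via eigendecomposition; tiny
    # eigenvalues are dropped for numerical stability, giving the
    # low-rank approximation K ~= K_nm @ K_mm^+ @ K_mn.
    vals, vecs = np.linalg.eigh(K_mm)
    keep = vals > 1e-10
    T = vecs[:, keep] / np.sqrt(vals[keep])  # m x r

    def z(Xq):
        return rbf(np.atleast_2d(Xq), L) @ T  # (n, r) features

    return z

# Usage on random data; X would be (a buffer of) training instances.
X = np.random.default_rng(3).normal(size=(500, 10))
z = make_nystrom_map(X, n_landmarks=50, gamma=0.5)
print(z(X[:5]).shape)  # (5, r) with r <= 50
```

The key contrast with the Fourier map above is that the Nyström features are data-dependent: they are only as good as the sampled landmarks, but they can adapt to the geometry of the training data.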
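Finally, the abstract's mini-batch Online Gradient Descent for AUC can be illustrated with the standard pairwise hinge surrogate of the AUC risk: averaging the gradient over all positive-negative pairs in a mini-batch reduces its variance relative to single-pair updates. Everything below (names, loss, and update details) is a hedged sketch under those assumptions, not the paper's algorithm.

```python
import numpy as np

def minibatch_ogd_auc(Z, y, eta=0.1, batch_size=32, seed=0):
    """Mini-batch online gradient descent on the pairwise hinge
    surrogate of the AUC risk, over precomputed features Z (n, D) and
    labels y in {+1, -1}. Pair loss: max(0, 1 - w @ (z_pos - z_neg))."""
    rng = np.random.default_rng(seed)
    n, D = Z.shape
    w = np.zeros(D)
    order = rng.permutation(n)  # simulate a stream over the data
    for start in range(0, n, batch_size):
        batch = order[start:start + batch_size]
        pos = Z[batch][y[batch] == 1]
        neg = Z[batch][y[batch] == -1]
        if len(pos) == 0 or len(neg) == 0:
            continue  # a pairwise AUC loss needs both classes
        diffs = pos[:, None, :] - neg[None, :, :]  # all pos-neg pairs
        margins = diffs @ w                        # (n_pos, n_neg)
        active = (margins < 1.0)[..., None]        # where hinge is active
        # Averaging the per-pair hinge gradients over the mini-batch
        # reduces the variance of the gradient estimate.
        grad = -(active * diffs).sum(axis=(0, 1)) / max(int(active.sum()), 1)
        w -= eta * grad
    return w

# Usage on synthetic imbalanced data; Z would come from an RFF or
# Nystrom map as sketched above.
rng = np.random.default_rng(7)
Z = rng.normal(size=(1000, 50))
y = np.where(rng.random(1000) < 0.1, 1, -1)
w = minibatch_ogd_auc(Z, y)
```

In a genuinely online setting the features would be produced on the fly by one of the two maps above; the shuffled-batch loop here serves only to show the pairwise gradient and the variance-reduction role of mini-batching.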