Robust Regularized Kernel Regression

Robust regression techniques are critical for fitting noisy data in real-world applications. Most previous work on robust kernel regression formulates the problem in a dual form, which is then solved by a quadratic programming solver. In this correspondence, we propose a new formulation for robust regularized kernel regression under the theoretical framework of regularization networks and tackle the optimization problem directly in the primal. We show that the primal and dual approaches are equivalent and achieve similar regression performance, but the primal formulation is more efficient and easier to implement than the dual one. Unlike previous work, our approach also optimizes the bias term. In addition, we show that the proposed solution can easily be extended to other noise-resilient loss functions, including the Huber-ε insensitive loss function. Finally, we conduct a set of experiments on both artificial and real data sets, in which promising results show that the proposed method is effective and more efficient than traditional approaches.
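The abstract describes the method only at a high level. As an illustrative sketch (not the authors' actual algorithm, which is not reproduced in this record), the Python snippet below fits a regularized kernel regressor in the primal with a Huber loss, optimizing both the expansion coefficients and the bias term via iteratively reweighted least squares; the RBF kernel, the IRLS solver, and all function and parameter names are assumptions made for illustration.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def robust_kernel_regression(X, y, lam=1.0, gamma=1.0, delta=1.0, iters=30):
    """Illustrative sketch, not the paper's algorithm: minimize, in the primal,
    sum_i huber_delta(y_i - f(x_i)) + (lam/2) * alpha' K alpha,
    where f(x) = sum_j alpha_j k(x_j, x) + b, via iteratively reweighted
    least squares (IRLS). Both alpha and the bias b are optimized."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    w = np.ones(n)                                # IRLS per-sample weights
    alpha, b = np.zeros(n), 0.0
    for _ in range(iters):
        # Weighted, regularized normal equations for the stacked unknowns [alpha; b]
        A = np.block([[w[:, None] * K + lam * np.eye(n), w[:, None]],
                      [w[None, :] @ K,                   np.array([[w.sum()]])]])
        rhs = np.concatenate([w * y, [w @ y]])
        sol = np.linalg.solve(A, rhs)
        alpha, b = sol[:n], sol[n]
        r = y - (K @ alpha + b)                   # residuals
        # Huber weights: 1 inside the quadratic zone, delta/|r| in the linear tails,
        # so large (outlier) residuals are progressively down-weighted
        w = np.where(np.abs(r) <= delta, 1.0, delta / np.maximum(np.abs(r), 1e-12))
    return alpha, b

# Toy usage (hypothetical data): a noisy sine with a few gross outliers
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(0, 6, 80)[:, None]
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
    y[::15] += 3.0                                # inject outliers
    alpha, b = robust_kernel_regression(X, y, lam=0.5, gamma=2.0, delta=0.5)
    y_hat = rbf_kernel(X, X, 2.0) @ alpha + b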

Bibliographic Details
Main Authors: ZHU, Jianke, HOI, Steven C. H., LYU, Michael R.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2008
Subjects: Kernel regression; regularized least squares (RLS); robust estimator; support vector machine (SVM); Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/2316
https://ink.library.smu.edu.sg/context/sis_research/article/3316/viewcontent/RobustRegularized_2008.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-3316
DOI: 10.1109/TSMCB.2008.927279
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
License: http://creativecommons.org/licenses/by-nc-nd/4.0/