Cross-Validation in Nonparametric Regression with Outliers


Bibliographic Details
Main Author: Leung, Denis H. Y.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2005
Online Access:https://ink.library.smu.edu.sg/soe_research/127
https://ink.library.smu.edu.sg/context/soe_research/article/1126/viewcontent/euclid.aos.1132936564.pdf
Institution: Singapore Management University
Description
Summary: A popular data-driven method for choosing the bandwidth in standard kernel regression is cross-validation. Even when there are outliers in the data, robust kernel regression can be used to estimate the unknown regression curve [Robust and Nonlinear Time Series Analysis. Lecture Notes in Statist. (1984) 26 163-184]. However, under these circumstances standard cross-validation is no longer a satisfactory bandwidth selector, because it is unduly influenced by the extreme prediction errors caused by these outliers. The more robust method proposed here is a cross-validation criterion that discounts extreme prediction errors. In large samples the robust method chooses consistent bandwidths, and its consistency is practically independent of the form in which extreme prediction errors are discounted. Additionally, an evaluation of the method's finite-sample behavior in a simulation study shows that it performs favorably. The method can also be applied to other problems that require cross-validation, for example model selection.
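
The sketch below illustrates the general idea described in the abstract: choosing a kernel-regression bandwidth by leave-one-out cross-validation while discounting extreme prediction errors. It is not the paper's exact procedure; the Gaussian kernel, the Huber loss as the discounting function, the MAD-based scale estimate, and the candidate bandwidth grid are all illustrative assumptions.

import numpy as np

def nw_predict(x0, x, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def huber(r, c=1.345):
    """Bounded (Huber) loss: quadratic for small residuals, linear for large ones."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r ** 2, c * a - 0.5 * c ** 2)

def robust_cv_bandwidth(x, y, bandwidths):
    """Pick the bandwidth minimizing a robust leave-one-out CV criterion."""
    n = len(x)
    scores = []
    for h in bandwidths:
        resid = np.empty(n)
        for i in range(n):
            mask = np.arange(n) != i                 # leave observation i out
            resid[i] = y[i] - nw_predict(x[i], x[mask], y[mask], h)
        scale = np.median(np.abs(resid)) / 0.6745    # robust scale estimate (assumption)
        scores.append(np.mean(huber(resid / scale))) # discount extreme prediction errors
    return bandwidths[int(np.argmin(scores))]

# Example: noisy sine curve with a few gross outliers injected.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=100)
y[::20] += 5.0                                       # outliers
h_grid = np.linspace(0.02, 0.3, 15)
print("selected bandwidth:", robust_cv_bandwidth(x, y, h_grid))

Because the Huber loss grows only linearly in large residuals, the handful of gross outliers no longer dominates the cross-validation score, so the selected bandwidth is close to what ordinary cross-validation would pick on clean data.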