Learning under concept drift with Follow-the-Regularized-Leader and Adaptive Decaying Proximal

Bibliographic Details
Main Authors: Huynh, Ngoc Anh, Ng, Wee Keong, Ariyapala, Kanishka
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2018
Subjects:
Online Access:https://hdl.handle.net/10356/87712
http://hdl.handle.net/10220/45494
Institution: Nanyang Technological University
Description
Summary: Concept drift refers to the problem in which the statistical properties of the data-generating process change over time. Recently, the Time Decaying Adaptive Prediction (TDAP) algorithm [1] was proposed to address the problem of concept drift. TDAP was designed to account for the effect of drifting concepts by discounting the contribution of previous learning examples with an exponentially decaying factor. The drawback of TDAP is that the rate of its decaying factor must be tuned manually. To address this drawback, we propose a new adaptive online algorithm, called Follow-the-Regularized-Leader with Adaptive Decaying Proximal (FTRL-ADP). There are two novelties in our approach. First, we derive a rule to automatically update the decaying rate, based on a rigorous theoretical analysis. Second, we use a concept drift detector to identify major drifts and reset the update rule accordingly. Comparative experiments with 14 datasets and 6 other online algorithms show that FTRL-ADP is most advantageous in noisy environments with real drifts.
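
As a rough illustration of the kind of discounting described in the abstract, the Python sketch below implements a standard FTRL-Proximal logistic-regression update in which the accumulated gradient statistics are multiplied by a fixed decay factor gamma before each new example is absorbed. This is not the authors' FTRL-ADP: all names and parameter values (DecayingFTRLProximal, alpha, l1, gamma) are hypothetical, and gamma is hand-set here, which is exactly the manual tuning burden the paper says FTRL-ADP removes by updating the decay rate automatically and resetting it when its drift detector fires.

import numpy as np

class DecayingFTRLProximal:
    """FTRL-Proximal for online logistic regression with an exponential
    decay applied to the accumulated gradient statistics (illustrative
    sketch only; the decay rate gamma is a fixed hyperparameter here)."""

    def __init__(self, dim, alpha=0.1, l1=1.0, gamma=0.99):
        self.alpha = alpha        # learning-rate scale
        self.l1 = l1              # L1 regularization strength
        self.gamma = gamma        # decay factor discounting past examples
        self.z = np.zeros(dim)    # decayed sum of adjusted gradients
        self.n = np.zeros(dim)    # decayed sum of squared gradients
        self.w = np.zeros(dim)    # current weight vector

    def _update_weights(self):
        # Closed-form FTRL-Proximal solution with L1 shrinkage.
        eta = self.alpha / (1.0 + np.sqrt(self.n))
        shrunk = np.sign(self.z) * np.maximum(np.abs(self.z) - self.l1, 0.0)
        self.w = -eta * shrunk

    def predict_proba(self, x):
        # Sigmoid of the linear score, clipped for numerical stability.
        return 1.0 / (1.0 + np.exp(-np.clip(x @ self.w, -30.0, 30.0)))

    def fit_one(self, x, y):
        # One online step with label y in {0, 1}.
        p = self.predict_proba(x)
        g = (p - y) * x
        sigma = (np.sqrt(self.n + g * g) - np.sqrt(self.n)) / self.alpha
        # Discount old statistics before adding the new example,
        # mimicking TDAP-style exponential forgetting.
        self.z = self.gamma * self.z + g - sigma * self.w
        self.n = self.gamma * self.n + g * g
        self._update_weights()

# Minimal usage on synthetic data (hypothetical example):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = DecayingFTRLProximal(dim=5)
    for _ in range(1000):
        x = rng.normal(size=5)
        y = float(x[0] + 0.5 * x[1] > 0.0)
        model.fit_one(x, y)

With gamma close to 1 the sketch behaves like plain FTRL-Proximal, while smaller values forget old examples faster; choosing that trade-off automatically, and resetting it on detected drifts, is the role the abstract attributes to FTRL-ADP's adaptive rule.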