Learning under concept drift with Follow-the-Regularized-Leader and Adaptive Decaying Proximal


Bibliographic Details
Main Authors: Huynh, Ngoc Anh, Ng, Wee Keong, Ariyapala, Kanishka
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2018
Online Access: https://hdl.handle.net/10356/87712
http://hdl.handle.net/10220/45494
Description
Summary: Concept drift refers to the problem that the statistical properties of the data-generating process change over time. Recently, the Time Decaying Adaptive Prediction (TDAP) algorithm was proposed to address the problem of concept drift. TDAP was designed to account for the effect of drifting concepts by discounting the contribution of previous learning examples using an exponentially decaying factor. The drawback of TDAP is that the rate of its decaying factor must be manually tuned. To address this drawback, we propose a new adaptive online algorithm, called Follow-the-Regularized-Leader with Adaptive Decaying Proximal (FTRL-ADP). There are two novelties in our approach. First, we derive a rule to automatically update the decaying rate, based on a rigorous theoretical analysis. Second, we use a concept drift detector to identify major drifts and reset the update rule accordingly. Comparative experiments with 14 datasets and 6 other online algorithms show that FTRL-ADP is most advantageous in noisy environments with real drifts.
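The abstract combines two ingredients: exponential discounting of past examples in an FTRL-style online learner, and a drift detector that resets accumulated state when a major drift occurs. The sketch below illustrates that general idea only; the function name `run_decayed_ftrl`, the fixed decaying factor `gamma`, and the window-based drift check are all illustrative assumptions and are not the paper's actual FTRL-ADP update rule or detector (in particular, FTRL-ADP adapts the decaying rate automatically, which is not reproduced here).

```python
import numpy as np

def run_decayed_ftrl(X, y, gamma=0.99, lr=0.1, drift_window=30, drift_factor=2.0):
    """Online linear learner with exponentially discounted gradient
    accumulators and a naive drift-triggered reset, in the spirit of
    (but not identical to) decayed FTRL / FTRL-ADP.

    Returns the final weight vector and the number of resets triggered.
    """
    n, d = X.shape
    z = np.zeros(d)            # exponentially discounted gradient sum
    w = np.zeros(d)
    sq_errors = []             # per-round squared errors, used for drift checking
    resets = 0
    for t in range(n):
        err = X[t] @ w - y[t]              # squared-loss residual
        sq_errors.append(err * err)
        # Naive drift check (an illustrative stand-in for the paper's
        # detector): recent loss much larger than the preceding window.
        if len(sq_errors) >= 2 * drift_window:
            recent = np.mean(sq_errors[-drift_window:])
            older = np.mean(sq_errors[-2 * drift_window:-drift_window])
            if recent > drift_factor * older:
                z[:] = 0.0                 # forget stale accumulated state
                sq_errors = sq_errors[-drift_window:]
                resets += 1
        grad = err * X[t]                  # gradient of 0.5 * err**2
        z = gamma * z + grad               # discount past examples
        w = -lr * z                        # closed-form FTRL-style step
    return w, resets
```

On a stream whose target flips sign halfway through, the discounting alone adapts slowly, while the reset discards the stale accumulator and lets the learner re-converge on the new concept.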