Hybrid stochastic-deterministic minibatch proximal gradient: Less-than-single-pass optimization with nearly optimal generalization

Bibliographic Details
Main Authors: ZHOU, Pan, YUAN, Xiaotong
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2020
Online Access: https://ink.library.smu.edu.sg/sis_research/9030
https://ink.library.smu.edu.sg/context/sis_research/article/10033/viewcontent/2020_ICML_HSDN.pdf
Physical Description
Summary: Stochastic variance-reduced gradient (SVRG) algorithms have been shown to perform favorably in solving large-scale learning problems. Despite this remarkable success, the stochastic gradient complexity of SVRG-type algorithms usually scales linearly with the data size and can thus still be expensive for huge datasets. To address this deficiency, we propose a hybrid stochastic-deterministic minibatch proximal gradient (HSDMPG) algorithm for strongly convex problems that enjoys provably improved, data-size-independent complexity guarantees.
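
The summary does not spell out the update rule, so the following minimal Python sketch only illustrates the general idea of combining a deterministic (full-gradient) anchor with stochastic minibatch corrections inside a proximal gradient loop. The estimator form, the geometrically growing minibatch schedule, and all function and parameter names (grad_batch, prox_l1, eta, T) are illustrative assumptions, not the authors' exact HSDMPG procedure.

    import numpy as np

    def prox_l1(v, t):
        """Soft-thresholding: proximal operator of t * ||.||_1 (example regularizer)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def hybrid_minibatch_prox_grad(grad_batch, prox, w0, n, eta=0.1, T=20, rng=None):
        """Illustrative hybrid stochastic-deterministic minibatch proximal gradient.

        grad_batch(w, idx): minibatch gradient of the smooth loss over sample indices idx
        prox(v, t):         proximal operator of the nonsmooth regularizer at step size t
        NOTE: a simplified sketch, not the exact HSDMPG schedule from the paper.
        """
        rng = np.random.default_rng() if rng is None else rng
        w = w0.copy()
        w_anchor = w0.copy()
        g_anchor = grad_batch(w_anchor, np.arange(n))  # one deterministic full-gradient anchor
        for t in range(T):
            b = min(n, 2 ** t)  # assumed geometrically growing minibatch size
            idx = rng.choice(n, size=b, replace=False)
            # Hybrid estimator: stochastic minibatch correction around the deterministic anchor.
            g = grad_batch(w, idx) - grad_batch(w_anchor, idx) + g_anchor
            w = prox(w - eta * g, eta)
        return w

    if __name__ == "__main__":
        # Toy usage: elastic-net-style least squares; the l2 term makes the smooth
        # part strongly convex, matching the regime the abstract targets.
        n, d, lam = 1000, 20, 0.1
        rng = np.random.default_rng(0)
        A, y = rng.standard_normal((n, d)), rng.standard_normal(n)

        def grad_batch(w, idx):  # gradient of averaged squared loss plus l2 penalty
            r = A[idx] @ w - y[idx]
            return A[idx].T @ r / len(idx) + lam * w

        w = hybrid_minibatch_prox_grad(grad_batch, prox_l1, np.zeros(d), n)

The growing minibatch in this sketch is what lets early iterations stay cheap while later iterations approach deterministic accuracy, which is the intuition behind a less-than-single-pass method; the paper's actual schedule and guarantees should be taken from the linked PDF.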