A hybrid stochastic-deterministic minibatch proximal gradient method for efficient optimization and generalization
Despite the success of stochastic variance-reduced gradient (SVRG) algorithms in solving large-scale problems, their stochastic gradient complexity often scales linearly with the data size, which becomes expensive on huge datasets. Accordingly, we propose a hybrid stochastic-deterministic minibatch proximal gradie...
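For context only, the following is a minimal sketch of a plain minibatch proximal gradient step on an l1-regularized least-squares (lasso) objective. It is a generic illustration of the proximal gradient technique mentioned in the abstract, not the paper's hybrid stochastic-deterministic scheme; every function name and parameter below is an assumption made for the sketch.

```python
import numpy as np


def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)


def minibatch_prox_grad(A, b, lam=0.1, lr=0.1, batch=64, epochs=20, seed=0):
    # Minibatch proximal gradient for the lasso problem
    #   min_w (1/2n) ||A w - b||^2 + lam * ||w||_1,
    # using a stochastic gradient estimated on a random minibatch
    # followed by the l1 proximal step.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for _ in range(n // batch):
            idx = rng.choice(n, size=batch, replace=False)
            # Unbiased minibatch estimate of the full gradient (1/n) A^T (A w - b).
            grad = A[idx].T @ (A[idx] @ w - b[idx]) / batch
            w = soft_threshold(w - lr * grad, lr * lam)
    return w


if __name__ == "__main__":
    # Synthetic sparse-recovery example (illustrative data only).
    rng = np.random.default_rng(1)
    A = rng.standard_normal((1000, 50))
    w_true = np.zeros(50)
    w_true[:5] = 1.0
    b = A @ w_true + 0.01 * rng.standard_normal(1000)
    w_hat = minibatch_prox_grad(A, b)
    print("estimated nonzero coefficients:", int(np.sum(np.abs(w_hat) > 0.1)))
```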
| Field | Value |
|---|---|
| Main Authors | ZHOU, Pan; YUAN, Xiao-Tong; LIN, Zhouchen; HOI, Steven C. H. |
| Format | text |
| Language | English |
| Published | Institutional Knowledge at Singapore Management University, 2021 |
| Online Access | https://ink.library.smu.edu.sg/sis_research/8979 ; https://ink.library.smu.edu.sg/context/sis_research/article/9982/viewcontent/2021_TPAMI_HSDN.pdf |
| Institution | Singapore Management University |
Similar Items
- Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
  by: ZHOU, Pan, et al.
  Published: (2019)
- Randomized gradient-free distributed online optimization via a dynamic regret analysis
  by: Pang, Yipeng, et al.
  Published: (2023)
- A fast algorithm for convex hull extraction in 2D images
  by: Ye, Q.-Z.
  Published: (2014)
- An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP
  by: Jiang, K., et al.
  Published: (2014)
- Inequalities on the variances of convex functions of random variables
  by: See, C.-T., et al.
  Published: (2016)