Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization
Variance reduction techniques like SVRG provide simple and fast algorithms for optimizing a convex finite-sum objective. For nonconvex objectives, these techniques can also find a first-order stationary point (with small gradient). However, in nonconvex optimization it is often crucial to find a sec...
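For context on the technique the abstract names, below is a minimal, hedged Python sketch of the standard SVRG variance-reduced gradient estimator for a finite-sum objective. It is not the Stabilized SVRG algorithm from the paper itself, and all function and parameter names (grads, step, epochs, etc.) are illustrative placeholders.

```python
import numpy as np

def svrg(x0, grads, n, step=0.1, epochs=10, inner_steps=None, rng=None):
    """Minimal sketch of standard SVRG (not the paper's Stabilized SVRG).

    grads(i, x) is assumed to return the gradient of the i-th component
    f_i at the point x, for a finite-sum objective f = (1/n) * sum_i f_i.
    """
    rng = rng or np.random.default_rng(0)
    inner_steps = inner_steps or n
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot, computed once per outer epoch.
        mu = sum(grads(i, snapshot) for i in range(n)) / n
        for _ in range(inner_steps):
            i = rng.integers(n)
            # Variance-reduced estimator: unbiased for the full gradient,
            # with variance shrinking as x stays close to the snapshot.
            v = grads(i, x) - grads(i, snapshot) + mu
            x -= step * v
    return x
```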
Main Authors: GE, Rong; LI, Zhize; WANG, Weiyao; WANG, Xiang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2019
Online Access:
https://ink.library.smu.edu.sg/sis_research/8677
https://ink.library.smu.edu.sg/context/sis_research/article/9680/viewcontent/COLT19_stabilizedsvrg.pdf
Institution: Singapore Management University
Similar Items
- A simple proximal stochastic gradient method for nonsmooth nonconvex optimization
  by: LI, Zhize, et al. Published: (2018)
- PAGE: A simple and optimal probabilistic gradient estimator for nonconvex optimization
  by: LI, Zhize, et al. Published: (2021)
- Simple and optimal stochastic gradient methods for nonsmooth nonconvex optimization
  by: LI, Zhize, et al. Published: (2022)
- BEER: Fast O(1/T) rate for decentralized nonconvex optimization with communication compression
  by: ZHAO, Haoyu, et al. Published: (2022)
- DESTRESS: Computation-optimal and communication-efficient decentralized nonconvex finite-sum optimization
  by: LI, Boyue, et al. Published: (2022)