New insight into hybrid stochastic gradient descent: Beyond with-replacement sampling and convexity
As an incremental-gradient algorithm, hybrid stochastic gradient descent (HSGD) enjoys the merits of both stochastic and full gradient methods for finite-sum optimization. However, the existing rate-of-convergence analysis for HSGD assumes with-replacement sampling (WRS) and is restricted to convex problems.
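For orientation only, below is a minimal sketch of the hybrid stochastic/full-gradient idea the abstract refers to: component gradients are averaged over a without-replacement mini-batch whose size grows toward the full data set, so early steps behave like SGD and later steps approach full gradient descent. The batch-growth schedule, step size, and function names here are illustrative assumptions, not the paper's exact HSGD update.

```python
import numpy as np

def hsgd(grad_i, n, w0, lr=0.05, iters=50, batch0=2, growth=1.2, seed=0):
    """Hybrid-SGD sketch: average component gradients over a mini-batch drawn
    without replacement, and grow the batch so later steps approach the full gradient."""
    rng = np.random.default_rng(seed)
    w, batch = np.array(w0, dtype=float), float(batch0)
    for _ in range(iters):
        m = min(int(round(batch)), n)
        idx = rng.choice(n, size=m, replace=False)         # without-replacement sample
        g = np.mean([grad_i(w, i) for i in idx], axis=0)   # hybrid gradient estimate
        w -= lr * g
        batch *= growth                                     # enlarge the batch toward n
    return w

# Toy usage: least squares, f(w) = (1/n) * sum_i 0.5 * (x_i . w - y_i)^2
rng = np.random.default_rng(1)
X, y = rng.normal(size=(100, 3)), rng.normal(size=100)
w_hat = hsgd(lambda w, i: (X[i] @ w - y[i]) * X[i], n=100, w0=np.zeros(3))
```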
| Main Authors: | ZHOU, Pan; YUAN, Xiao-Tong; FENG, Jiashi |
|---|---|
| Format: | text |
| Language: | English |
| Published: | Institutional Knowledge at Singapore Management University, 2018 |
| Online Access: | https://ink.library.smu.edu.sg/sis_research/9007 https://ink.library.smu.edu.sg/context/sis_research/article/10010/viewcontent/NeurIPS_2018_new_insight_into_hybrid_stochastic_gradient_descent_beyond_with_replacement_sampling_and_convexity_Paper.pdf |
| Institution: | Singapore Management University |
Similar Items
- Efficient stochastic gradient hard thresholding
  by: ZHOU, Pan, et al.
  Published: (2018)
- STOCHASTIC GRADIENT DESCENT AND ITS EXTENSIONS
  by: ZHANG, Jingwei
  Published: (2021)
- Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
  by: ZHOU, Pan, et al.
  Published: (2019)
- Byzantine-resilient decentralized stochastic gradient descent
  by: GUO, Shangwei, et al.
  Published: (2022)
- Byzantine-resilient decentralized stochastic gradient descent
  by: GUO, Shangwei, et al.
  Published: (2024)