Efficient stochastic gradient hard thresholding
Stochastic gradient hard thresholding methods have recently been shown to work favorably in solving large-scale empirical risk minimization problems under sparsity or rank constraint. Despite the improved iteration complexity over full gradient methods, the gradient evaluation and hard thresholding...
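For context, a minimal sketch of the generic stochastic gradient hard thresholding iteration referenced in the abstract: a minibatch gradient step followed by hard thresholding onto the k largest-magnitude coordinates. This is an illustrative least-squares example under assumed settings (function names, step size, sparsity level, and batch size are all hypothetical), not the specific algorithm proposed in this paper.

```python
import numpy as np

def hard_threshold(w, k):
    # Keep the k largest-magnitude entries of w; zero out the rest.
    out = np.zeros_like(w)
    idx = np.argpartition(np.abs(w), -k)[-k:]
    out[idx] = w[idx]
    return out

def sg_hard_thresholding(X, y, k, eta=0.05, batch_size=32, n_iters=500, seed=0):
    # Minimize the least-squares empirical risk ||Xw - y||^2 / (2n)
    # subject to ||w||_0 <= k, via minibatch gradient steps followed
    # by hard thresholding (an illustrative sketch, not the paper's method).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        batch = rng.choice(n, size=batch_size, replace=False)
        Xb, yb = X[batch], y[batch]
        grad = Xb.T @ (Xb @ w - yb) / batch_size   # stochastic gradient estimate
        w = hard_threshold(w - eta * grad, k)      # gradient step + sparse projection
    return w

# Usage: recover a sparse signal from noisy linear measurements.
rng = np.random.default_rng(1)
n, d, k = 500, 200, 10
w_true = np.zeros(d)
w_true[rng.choice(d, k, replace=False)] = rng.standard_normal(k)
X = rng.standard_normal((n, d))
y = X @ w_true + 0.01 * rng.standard_normal(n)
w_hat = sg_hard_thresholding(X, y, k)
print("recovery error:", np.linalg.norm(w_hat - w_true))
```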
Main Authors: ZHOU, Pan; YUAN, Xiao-Tong; FENG, Jiashi
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2018
Online Access: https://ink.library.smu.edu.sg/sis_research/9003
https://ink.library.smu.edu.sg/context/sis_research/article/10006/viewcontent/NeurIPS_2018_efficient_stochastic_gradient_hard_thresholding_Paper.pdf
Institution: Singapore Management University
Similar Items
- New insight into hybrid stochastic gradient descent: Beyond with-replacement sampling and convexity
  by: ZHOU, Pan, et al.
  Published: (2018)
- Empirical risk landscape analysis for understanding deep neural networks
  by: ZHOU, Pan, et al.
  Published: (2018)
- Efficient gradient support pursuit with less hard thresholding for cardinality-constrained learning
  by: SHANG, Fanhua, et al.
  Published: (2021)
- A hybrid stochastic-deterministic minibatch proximal gradient method for efficient optimization and generalization
  by: ZHOU, Pan, et al.
  Published: (2021)
- Win: Weight-decay-integrated Nesterov acceleration for adaptive gradient algorithms
  by: ZHOU, Pan, et al.
  Published: (2023)