Efficient gradient support pursuit with less hard thresholding for cardinality-constrained learning
Recently, stochastic hard thresholding (HT) optimization methods [e.g., stochastic variance reduced gradient hard thresholding (SVRGHT)] have become increasingly attractive for solving large-scale sparsity/rank-constrained problems. However, they have much higher HT oracle complexities, especially for high-dimensional data or large-scale matrices.
Main Authors: SHANG, Fanhua; WEI, Bingkun; LIU, Hongying; LIU, Yuanyuan; ZHOU, Pan; GONG, Maoguo
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Subjects: Hard thresholding (HT); sparse learning; sparsity; rank-constrained problem; stochastic optimization; variance reduction
Online Access: https://ink.library.smu.edu.sg/sis_research/9049
Institution: Singapore Management University
id: sg-smu-ink.sis_research-10052
record_format: dspace
spelling: sg-smu-ink.sis_research-10052 2024-07-25T07:00:04Z
Efficient gradient support pursuit with less hard thresholding for cardinality-constrained learning
SHANG, Fanhua; WEI, Bingkun; LIU, Hongying; LIU, Yuanyuan; ZHOU, Pan; GONG, Maoguo
Recently, stochastic hard thresholding (HT) optimization methods [e.g., stochastic variance reduced gradient hard thresholding (SVRGHT)] have become increasingly attractive for solving large-scale sparsity/rank-constrained problems. However, they have much higher HT oracle complexities, especially for high-dimensional data or large-scale matrices. To address this issue, and inspired by the well-known Gradient Support Pursuit (GraSP) method, this article proposes a new Relaxed Gradient Support Pursuit (RGraSP) framework. Unlike GraSP, RGraSP requires only an approximate solution at each iteration. Based on this property of RGraSP, we also present an efficient stochastic variance reduced gradient support pursuit algorithm (SVRGSP) and its fast version (SVRGSP+). We prove that the gradient oracle complexity of both our algorithms is half that of SVRGHT. In particular, their HT complexity is about κŝ times lower than that of SVRGHT, where κŝ is the restricted condition number. Moreover, we prove that our algorithms enjoy fast linear convergence to an approximately global optimum, and we also present an asynchronous parallel variant to handle very high-dimensional and sparse data. Experimental results on both synthetic and real-world datasets show that our algorithms outperform state-of-the-art gradient HT methods.
2021-06-23T07:00:00Z text https://ink.library.smu.edu.sg/sis_research/9049 info:doi/10.1109/TNNLS.2021.3087805 Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University
Keywords: Complexity theory; Stochastic processes; Convergence; Indexes; Estimation; Approximation algorithms; Sparse matrices; Hard thresholding (HT); sparse learning; sparsity; rank-constrained problem; stochastic optimization; variance reduction; OS and Networks
institution: Singapore Management University
building: SMU Libraries
continent: Asia
country: Singapore
content_provider: SMU Libraries
collection: InK@SMU
language: English
topic: Complexity theory; Stochastic processes; Convergence; Indexes; Estimation; Approximation algorithms; Sparse matrices; Hard thresholding (HT); sparse learning; sparsity; rank-constrained problem; stochastic optimization; variance reduction; OS and Networks
description: Recently, stochastic hard thresholding (HT) optimization methods [e.g., stochastic variance reduced gradient hard thresholding (SVRGHT)] have become increasingly attractive for solving large-scale sparsity/rank-constrained problems. However, they have much higher HT oracle complexities, especially for high-dimensional data or large-scale matrices. To address this issue, and inspired by the well-known Gradient Support Pursuit (GraSP) method, this article proposes a new Relaxed Gradient Support Pursuit (RGraSP) framework. Unlike GraSP, RGraSP requires only an approximate solution at each iteration. Based on this property of RGraSP, we also present an efficient stochastic variance reduced gradient support pursuit algorithm (SVRGSP) and its fast version (SVRGSP+). We prove that the gradient oracle complexity of both our algorithms is half that of SVRGHT. In particular, their HT complexity is about κŝ times lower than that of SVRGHT, where κŝ is the restricted condition number. Moreover, we prove that our algorithms enjoy fast linear convergence to an approximately global optimum, and we also present an asynchronous parallel variant to handle very high-dimensional and sparse data. Experimental results on both synthetic and real-world datasets show that our algorithms outperform state-of-the-art gradient HT methods.
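As background for readers of this record, the hard thresholding (HT) oracle the abstract refers to simply keeps the s largest-magnitude entries of a vector and zeroes the rest; reducing how often this projection is applied is what the paper's HT-complexity results concern. The following is a minimal NumPy sketch, not code from the paper; the function name, the least-squares objective, and the step size are illustrative assumptions.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x; zero the rest.

    This is the HT oracle that cardinality-constrained methods
    (e.g., SVRGHT-style algorithms) apply after a gradient step.
    """
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    if s <= 0:
        return out
    # Indices of the s entries with the largest absolute values.
    support = np.argpartition(np.abs(x), -s)[-s:]
    out[support] = x[support]
    return out

# One illustrative iteration of hard-thresholded gradient descent
# on f(w) = 0.5 * ||A w - b||^2 under the constraint ||w||_0 <= 3:
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
w = np.zeros(10)
step = 0.01
grad = A.T @ (A @ w - b)
w = hard_threshold(w - step * grad, s=3)  # at most 3 nonzeros remain
```

The stochastic variants discussed in the abstract replace the full gradient with a variance-reduced stochastic estimate; the RGraSP framework additionally allows the inner minimization to be solved only approximately before each thresholding step.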
format: text
author: SHANG, Fanhua; WEI, Bingkun; LIU, Hongying; LIU, Yuanyuan; ZHOU, Pan; GONG, Maoguo
author_sort: SHANG, Fanhua
title: Efficient gradient support pursuit with less hard thresholding for cardinality-constrained learning
publisher: Institutional Knowledge at Singapore Management University
publishDate: 2021
url: https://ink.library.smu.edu.sg/sis_research/9049
_version_: 1814047717732646912