Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds


Bibliographic Details
Main Authors: ZHOU, Pan, YUAN, Xiao-Tong, YAN, Shuicheng, FENG, Jiashi
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2019
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/8990
https://ink.library.smu.edu.sg/context/sis_research/article/9993/viewcontent/2019_TPAMI_manifold.pdf
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-9993
record_format dspace
spelling sg-smu-ink.sis_research-9993 2024-07-25T08:26:53Z Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds ZHOU, Pan YUAN, Xiao-Tong YAN, Shuicheng FENG, Jiashi First-order non-convex Riemannian optimization algorithms have gained recent popularity in structured machine learning problems including principal component analysis and low-rank matrix completion. The current paper presents an efficient Riemannian Stochastic Path Integrated Differential EstimatoR (R-SPIDER) algorithm to solve the finite-sum and online Riemannian non-convex minimization problems. At the core of R-SPIDER is a recursive semi-stochastic gradient estimator that can accurately estimate the Riemannian gradient under not only exponential mapping and parallel transport, but also general retraction and vector transport operations. Compared with prior Riemannian algorithms, such a recursive gradient estimation mechanism endows R-SPIDER with higher computational efficiency in first-order oracle complexity. Specifically, for finite-sum problems with n components, R-SPIDER is proved to converge to an ε-accuracy stationary point within O(min(n + √n/ε², 1/ε³)) stochastic gradient evaluations, beating the best-known complexity O(n + 1/ε⁴); for online optimization, R-SPIDER is shown to converge with O(1/ε³) complexity which is, to the best of our knowledge, the first non-asymptotic result for online Riemannian optimization. For the special case of gradient dominated functions, we further develop a variant of R-SPIDER with an improved linear rate of convergence. Extensive experimental results demonstrate the advantage of the proposed algorithms over the state-of-the-art Riemannian non-convex optimization methods. 2019-08-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/8990 info:doi/10.1109/TPAMI.2019.2933841 https://ink.library.smu.edu.sg/context/sis_research/article/9993/viewcontent/2019_TPAMI_manifold.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Riemannian Optimization Stochastic Variance-Reduced Algorithm Non-convex Optimization Online Learning Theory and Algorithms
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Riemannian Optimization
Stochastic Variance-Reduced Algorithm
Non-convex Optimization
Online Learning
Theory and Algorithms
spellingShingle Riemannian Optimization
Stochastic Variance-Reduced Algorithm
Non-convex Optimization
Online Learning
Theory and Algorithms
ZHOU, Pan
YUAN, Xiao-Tong
YAN, Shuicheng
FENG, Jiashi
Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
description First-order non-convex Riemannian optimization algorithms have gained recent popularity in structured machine learning problems including principal component analysis and low-rank matrix completion. The current paper presents an efficient Riemannian Stochastic Path Integrated Differential EstimatoR (R-SPIDER) algorithm to solve the finite-sum and online Riemannian non-convex minimization problems. At the core of R-SPIDER is a recursive semi-stochastic gradient estimator that can accurately estimate the Riemannian gradient under not only exponential mapping and parallel transport, but also general retraction and vector transport operations. Compared with prior Riemannian algorithms, such a recursive gradient estimation mechanism endows R-SPIDER with higher computational efficiency in first-order oracle complexity. Specifically, for finite-sum problems with n components, R-SPIDER is proved to converge to an ε-accuracy stationary point within O(min(n + √n/ε², 1/ε³)) stochastic gradient evaluations, beating the best-known complexity O(n + 1/ε⁴); for online optimization, R-SPIDER is shown to converge with O(1/ε³) complexity which is, to the best of our knowledge, the first non-asymptotic result for online Riemannian optimization. For the special case of gradient dominated functions, we further develop a variant of R-SPIDER with an improved linear rate of convergence. Extensive experimental results demonstrate the advantage of the proposed algorithms over the state-of-the-art Riemannian non-convex optimization methods.
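note The description above only names the algorithm's key ingredient, a recursive semi-stochastic Riemannian gradient estimator combined with retraction and vector transport. The Python listing below is a minimal, hypothetical sketch of such a SPIDER-style estimator on the unit sphere, applied to a toy leading-eigenvector (PCA) objective; the projection-based retraction and vector transport, the batch sizes, the step size, and all function names (project_tangent, retract, transport, stoch_grad, r_spider) are illustrative assumptions for this sketch, not the authors' implementation or the paper's parameter choices.

    # Illustrative sketch of a SPIDER-style recursive Riemannian gradient
    # estimator on the unit sphere S^{d-1}, for the toy objective
    # f(x) = -(1/n) * sum_i (a_i^T x)^2 (leading eigenvector of A^T A).
    # All names and hyperparameters are assumptions, not from the paper.
    import numpy as np

    def project_tangent(x, g):
        # Riemannian gradient on the sphere: project g onto the tangent space at x.
        return g - np.dot(x, g) * x

    def retract(x, v):
        # Retraction on the sphere: move in the tangent direction, then renormalize.
        y = x + v
        return y / np.linalg.norm(y)

    def transport(x_new, v):
        # A simple vector transport: project the old tangent vector onto the
        # tangent space at the new point.
        return project_tangent(x_new, v)

    def stoch_grad(A, x, idx):
        # Mini-batch Euclidean gradient of f, projected to the tangent space.
        Ai = A[idx]
        return project_tangent(x, -2.0 * (Ai.T @ (Ai @ x)) / len(idx))

    def r_spider(A, T=200, q=10, b_big=256, b_small=16, eta=0.05, seed=0):
        rng = np.random.default_rng(seed)
        n, d = A.shape
        x = rng.standard_normal(d)
        x /= np.linalg.norm(x)
        v = np.zeros(d)
        for t in range(T):
            if t % q == 0:
                # Periodically refresh the estimator with a large mini-batch.
                idx = rng.choice(n, size=min(b_big, n), replace=False)
                v = stoch_grad(A, x, idx)
            x_new = retract(x, -eta * v)
            # Recursive semi-stochastic update: correct the transported previous
            # estimate with a small-batch gradient difference, as in SPIDER.
            idx = rng.choice(n, size=min(b_small, n), replace=False)
            v = stoch_grad(A, x_new, idx) - transport(x_new, stoch_grad(A, x, idx)) + transport(x_new, v)
            x = x_new
        return x

    # Example usage on synthetic data (the optimum is the leading eigenvector of A^T A):
    # A = np.random.default_rng(1).standard_normal((1000, 20))
    # x = r_spider(A)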
format text
author ZHOU, Pan
YUAN, Xiao-Tong
YAN, Shuicheng
FENG, Jiashi
author_facet ZHOU, Pan
YUAN, Xiao-Tong
YAN, Shuicheng
FENG, Jiashi
author_sort ZHOU, Pan
title Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
title_short Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
title_full Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
title_fullStr Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
title_full_unstemmed Faster first-order methods for stochastic non-convex optimization on Riemannian manifolds
title_sort faster first-order methods for stochastic non-convex optimization on riemannian manifolds
publisher Institutional Knowledge at Singapore Management University
publishDate 2019
url https://ink.library.smu.edu.sg/sis_research/8990
https://ink.library.smu.edu.sg/context/sis_research/article/9993/viewcontent/2019_TPAMI_manifold.pdf
_version_ 1814047702256713728