Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference

Gradient-based Monte Carlo sampling algorithms, such as Langevin dynamics and Hamiltonian Monte Carlo, are important methods for Bayesian inference. In large-scale settings, full gradients are not affordable, so stochastic gradients evaluated on mini-batches are used as a replacement. To reduce the high variance of noisy stochastic gradients, Dubey et al. (in: Advances in Neural Information Processing Systems, pp 1154–1162, 2016) applied the standard variance reduction technique to stochastic gradient Langevin dynamics and obtained both theoretical and experimental improvements. In this paper, we apply variance reduction to Hamiltonian Monte Carlo and achieve better theoretical convergence results than variance-reduced Langevin dynamics. Moreover, we apply a symmetric splitting scheme in our variance-reduced Hamiltonian Monte Carlo algorithms to further improve the theoretical results. The experimental results are consistent with the theoretical results: variance-reduced Hamiltonian Monte Carlo outperforms variance-reduced Langevin dynamics on Bayesian regression and classification tasks on real-world datasets.
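For intuition, the sketch below shows one way the ideas in the abstract fit together: an SVRG-style control variate reduces the variance of mini-batch gradient estimates inside a stochastic gradient HMC update with friction and injected noise. This is an illustrative sketch only, not the authors' exact algorithm (it omits, for example, the symmetric splitting scheme mentioned above); the function name vr_sghmc, the gradient oracle grad_i, and all step-size, friction, epoch-length, and batch-size parameters are assumptions chosen for illustration.

```python
import numpy as np

def vr_sghmc(grad_i, n_data, theta0, n_epochs=10, m=100, b=10,
             eta=1e-3, C=1.0, rng=None):
    """Illustrative variance-reduced stochastic gradient HMC (not the paper's
    exact algorithm). grad_i(theta, i) should return the gradient of the
    i-th term of the negative log-posterior at theta."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    d = theta.size
    samples = []
    for _ in range(n_epochs):
        # SVRG-style snapshot: compute the full gradient once at an anchor
        # point and reuse it as a control variate for the whole epoch.
        theta_tilde = theta.copy()
        full_grad = sum(grad_i(theta_tilde, i) for i in range(n_data))
        v = np.zeros(d)  # momentum variable
        for _ in range(m):
            idx = rng.integers(n_data, size=b)
            # Variance-reduced gradient estimate:
            #   g = (n/b) * sum_{i in batch} [grad_i(theta) - grad_i(theta_tilde)] + full_grad
            corr = sum(grad_i(theta, i) - grad_i(theta_tilde, i) for i in idx)
            g = (n_data / b) * corr + full_grad
            # Euler discretization of Hamiltonian dynamics with friction C and
            # injected Gaussian noise (SGHMC-style update).
            theta = theta + eta * v
            v = v - eta * g - eta * C * v + np.sqrt(2.0 * C * eta) * rng.standard_normal(d)
            samples.append(theta.copy())
    return samples
```

A plain stochastic gradient HMC sampler corresponds to dropping the control variate, i.e. using g = (n_data / b) * sum over the batch of grad_i(theta); the snapshot correction term is what removes most of the mini-batch noise.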


Bibliographic Details
Main Authors: LI, Zhize; ZHANG, Tianyi; CHENG, Shuyu; ZHU, Jun; LI, Jian
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2019
Subjects: Hamiltonian Monte Carlo; Variance reduction; Bayesian inference; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/8689
https://ink.library.smu.edu.sg/context/sis_research/article/9692/viewcontent/ML19_vrhmc.pdf
id sg-smu-ink.sis_research-9692
record_format dspace
spelling sg-smu-ink.sis_research-9692 2024-03-28T08:45:18Z Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference LI, Zhize ZHANG, Tianyi CHENG, Shuyu ZHU, Jun LI, Jian 2019-07-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/8689 info:doi/10.1007/s10994-019-05825-y https://ink.library.smu.edu.sg/context/sis_research/article/9692/viewcontent/ML19_vrhmc.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Hamiltonian Monte Carlo Variance reduction Bayesian inference Databases and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Hamiltonian Monte Carlo
Variance reduction
Bayesian inference
Databases and Information Systems
description Gradient-based Monte Carlo sampling algorithms, such as Langevin dynamics and Hamiltonian Monte Carlo, are important methods for Bayesian inference. In large-scale settings, full gradients are not affordable, so stochastic gradients evaluated on mini-batches are used as a replacement. To reduce the high variance of noisy stochastic gradients, Dubey et al. (in: Advances in Neural Information Processing Systems, pp 1154–1162, 2016) applied the standard variance reduction technique to stochastic gradient Langevin dynamics and obtained both theoretical and experimental improvements. In this paper, we apply variance reduction to Hamiltonian Monte Carlo and achieve better theoretical convergence results than variance-reduced Langevin dynamics. Moreover, we apply a symmetric splitting scheme in our variance-reduced Hamiltonian Monte Carlo algorithms to further improve the theoretical results. The experimental results are consistent with the theoretical results: variance-reduced Hamiltonian Monte Carlo outperforms variance-reduced Langevin dynamics on Bayesian regression and classification tasks on real-world datasets.
format text
author LI, Zhize
ZHANG, Tianyi
CHENG, Shuyu
ZHU, Jun
LI, Jian
title Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference
publisher Institutional Knowledge at Singapore Management University
publishDate 2019
url https://ink.library.smu.edu.sg/sis_research/8689
https://ink.library.smu.edu.sg/context/sis_research/article/9692/viewcontent/ML19_vrhmc.pdf