When Gaussian process meets big data : a review of scalable GPs

The vast quantity of information brought by big data, together with evolving computer hardware, has encouraged success stories in the machine learning community. Meanwhile, it poses challenges for Gaussian process regression (GPR), a well-known nonparametric and interpretable Bayesian model,...

Full description

Bibliographic Details
Main Authors: Liu, Haitao, Ong, Yew-Soon, Shen, Xiaobo, Cai, Jianfei
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2021
Subjects:
Online Access:https://hdl.handle.net/10356/148176
Institution: Nanyang Technological University
id sg-ntu-dr.10356-148176
record_format dspace
citation Liu, H., Ong, Y., Shen, X. & Cai, J. (2020). When Gaussian process meets big data : a review of scalable GPs. IEEE Transactions on Neural Networks and Learning Systems, 31(11), 4405-4423. https://dx.doi.org/10.1109/TNNLS.2019.2957109
issn 2162-237X
doi 10.1109/TNNLS.2019.2957109
orcid 0000-0003-1187-5374; 0000-0002-4480-169X; 0000-0002-9444-3763
handle https://hdl.handle.net/10356/148176
version Accepted version
funding This work was supported in part by the Rolls-Royce@NTU Corporate Laboratory, National Research Foundation (NRF) Singapore, through the Corp Lab@University Scheme, and in part by the Data Science and Artificial Intelligence Research Center (DSAIR) and the School of Computer Science and Engineering, Nanyang Technological University.
rights © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/TNNLS.2019.2957109.
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Computer science and engineering
Big Data
Gaussian Process Regression
description The vast quantity of information brought by big data, together with evolving computer hardware, has encouraged success stories in the machine learning community. Meanwhile, it poses challenges for Gaussian process regression (GPR), a well-known nonparametric and interpretable Bayesian model, which suffers from cubic complexity in the data size. To improve scalability while retaining desirable prediction quality, a variety of scalable GPs have been presented. However, they have not yet been comprehensively reviewed and analyzed so as to be well understood by both academia and industry. Given the explosion of data size, a review of scalable GPs is timely and important for the GP community. To this end, this article reviews state-of-the-art scalable GPs in two main categories: global approximations that distill the entire data and local approximations that divide the data for subspace learning. For global approximations, we mainly focus on sparse approximations, comprising prior approximations that modify the prior but perform exact inference, posterior approximations that retain the exact prior but perform approximate inference, and structured sparse approximations that exploit specific structures in the kernel matrix; for local approximations, we highlight the mixture/product of experts, which conducts model averaging over multiple local experts to boost predictions. To present a complete review, recent advances in improving the scalability and capability of scalable GPs are covered. Finally, extensions and open issues of scalable GPs in various scenarios are discussed to inspire novel ideas for future research avenues.
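The abstract's central claim — exact GPR requires solving against an n×n kernel matrix, hence cubic cost in the training size, which global approximations sidestep by working with far fewer points — can be illustrated with a minimal NumPy sketch. This is not code from the paper; the function names, RBF kernel, and the simple "subset-of-data" strategy shown here are illustrative assumptions, chosen as the most basic instance of a global approximation.

```python
import numpy as np

def rbf(X1, X2, ls=1.0):
    """Squared-exponential (RBF) kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gpr_predict(X, y, Xs, noise=1e-2):
    """Exact GPR predictive mean: the solve against the n x n kernel
    matrix K is the O(n^3) bottleneck the review is about."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    return Ks @ np.linalg.solve(K, y)

# Toy 1-D regression problem: noisy observations of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Xs = np.linspace(-3, 3, 5)[:, None]   # test inputs

# Exact GPR on all n = 200 points.
mu_exact = gpr_predict(X, y, Xs)

# Simplest global approximation: train on a random subset of m << n
# points, reducing the solve from O(n^3) to O(m^3).
idx = rng.choice(len(X), size=50, replace=False)
mu_sod = gpr_predict(X[idx], y[idx], Xs)
```

Sparse approximations in the review refine this idea: instead of discarding data, they summarize all n points through m inducing points, keeping the O(m³) cost while using the full dataset.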
author2 School of Computer Science and Engineering
format Article
author Liu, Haitao
Ong, Yew-Soon
Shen, Xiaobo
Cai, Jianfei
title When Gaussian process meets big data : a review of scalable GPs
publishDate 2021
url https://hdl.handle.net/10356/148176