Modulating scalable Gaussian processes for expressive statistical learning
For a learning task, the Gaussian process (GP) seeks to learn the statistical relationship between inputs and outputs, since it offers not only the prediction mean but also the associated variability. The vanilla GP, however, struggles to learn complicated distributions with properties such as...
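The abstract's point that a GP returns both a prediction mean and its variability can be illustrated with a minimal exact GP regression sketch (not the paper's method; kernel, lengthscale, and noise values below are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel between 1-D point sets A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    # Exact GP regression: posterior mean and variance at test inputs Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha                       # predictive mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0) # predictive variance
    return mean, var

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
Xs = np.array([0.0, 5.0])  # one test point on the data, one far away
mean, var = gp_posterior(X, y, Xs)
# The variance is small near the training data and grows far from it,
# which is the "associated variability" the abstract refers to.
```

Far from the data the predictive variance reverts toward the prior kernel variance, which is exactly the uncertainty quantification that distinguishes GPs from point-prediction models.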
Main Authors: Liu, Haitao; Ong, Yew-Soon; Jiang, Xiaomo; Wang, Xiaofang
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2022
Online Access: https://hdl.handle.net/10356/162582
Institution: Nanyang Technological University
Similar Items
- Understanding and comparing scalable Gaussian process regression for big data, by: Liu, Haitao, et al. Published: (2020)
- When Gaussian process meets big data : a review of scalable GPs, by: Liu, Haitao, et al. Published: (2021)
- Remarks on multi-output Gaussian process regression, by: Liu, Haitao, et al. Published: (2020)
- Cope with diverse data structures in multi-fidelity modeling : a Gaussian process method, by: Liu, Haitao, et al. Published: (2020)
- New advances in Bayesian inference for Gaussian process and deep Gaussian process models, by: Yu Haibin. Published: (2020)