2017-07-17-cutajar17a.md

---
title: Random Feature Expansions for Deep Gaussian Processes
booktitle: Proceedings of the 34th International Conference on Machine Learning
year: 2017
volume: 70
series: Proceedings of Machine Learning Research
address:
month: 0
publisher: PMLR
pdf:
url:
abstract: The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work we introduce a novel formulation of DGPs based on random feature expansions that we train using stochastic variational inference. This yields a practical learning framework which significantly advances the state-of-the-art in inference for DGPs, and enables accurate quantification of uncertainty. We extensively showcase the scalability and performance of our proposal on several datasets with up to 8 million observations, and various DGP architectures with up to 30 hidden layers.
layout: inproceedings
id: cutajar17a
tex_title: Random Feature Expansions for Deep {G}aussian Processes
bibtex_author: Kurt Cutajar and Edwin V. Bonilla and Pietro Michiardi and Maurizio Filippone
firstpage: 884
lastpage: 893
page: 884-893
order: 884
cycles: false
editor:
- given: Doina
  family: Precup
- given: Yee Whye
  family: Teh
author:
- given: Kurt
  family: Cutajar
- given: Edwin V.
  family: Bonilla
- given: Pietro
  family: Michiardi
- given: Maurizio
  family: Filippone
date: 2017-07-17
container-title: Proceedings of the 34th International Conference on Machine Learning
genre: inproceedings
issued:
  date-parts:
  - 2017
  - 7
  - 17
extras:
---
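The abstract's key ingredient, a random feature expansion of a Gaussian process kernel, can be illustrated with the classic random Fourier feature construction for an RBF kernel (this is a generic sketch of the technique, not the paper's DGP implementation; the function name and parameters are illustrative):

```python
import numpy as np

def random_fourier_features(X, n_features, lengthscale, rng):
    """Map X (n, d) to features phi (n, n_features) such that
    phi @ phi.T approximates the RBF kernel matrix of X.

    Uses the random Fourier feature construction:
    W ~ N(0, 1/lengthscale^2), b ~ U[0, 2*pi),
    phi(x) = sqrt(2 / n_features) * cos(x @ W + b).
    """
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Approximate kernel from the random features
phi = random_fourier_features(X, n_features=2000, lengthscale=1.0, rng=rng)
K_approx = phi @ phi.T

# Exact RBF kernel k(x, y) = exp(-||x - y||^2 / 2) for comparison
sq = np.sum(X**2, axis=1)
K_exact = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T) / 2.0)

print(np.max(np.abs(K_approx - K_exact)))  # small approximation error
```

Replacing the exact kernel with such a finite feature map turns each GP layer into a (Bayesian) linear model in the feature space, which is what makes stochastic variational inference and mini-batch training tractable at the scales the abstract reports.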