Scalable GAM using sparse variational Gaussian processes

12/28/2018
by Vincent Adam, et al.

Generalized additive models (GAMs) are a widely used class of models of interest to statisticians, as they provide a flexible way to design interpretable models of data beyond linear models. Here we propose a scalable and well-calibrated Bayesian treatment of GAMs using Gaussian processes (GPs), leveraging recent advances in variational inference. We use sparse GPs to represent each additive component and exploit the additive structure of the model to efficiently represent an a posteriori Gaussian coupling between the components.
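The core idea behind a GP-based GAM is that a sum of independent GPs, one per input dimension, is itself a GP whose kernel is the sum of the per-dimension kernels. The sketch below illustrates this in plain NumPy with exact GP regression rather than the paper's sparse variational inference, and all function names (`rbf`, `additive_kernel`, `gp_posterior_mean`, `component_mean`) are hypothetical, chosen for illustration only. The per-component posterior mean shows the interpretability benefit: each additive component can be recovered separately.

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    """Squared-exponential kernel on a single (1-D) input dimension."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def additive_kernel(X, Y, lengthscales):
    """Additive kernel: one RBF per input dimension, summed.
    A sum of independent GPs is a GP with the summed kernel,
    which is what makes a GP-based model additive (a GAM)."""
    return sum(rbf(X[:, d], Y[:, d], l) for d, l in enumerate(lengthscales))

def gp_posterior_mean(Xtrain, y, Xtest, lengthscales, noise=0.1):
    """Exact GP regression posterior mean under the additive kernel
    (used here for clarity; the paper uses sparse variational GPs)."""
    K = additive_kernel(Xtrain, Xtrain, lengthscales)
    alpha = np.linalg.solve(K + noise * np.eye(len(y)), y)
    return additive_kernel(Xtest, Xtrain, lengthscales) @ alpha

def component_mean(Xtrain, y, Xtest, d, lengthscales, noise=0.1):
    """Posterior mean of the d-th additive component alone:
    only the d-th kernel appears in the cross-covariance term."""
    K = additive_kernel(Xtrain, Xtrain, lengthscales)
    alpha = np.linalg.solve(K + noise * np.eye(len(y)), y)
    return rbf(Xtest[:, d], Xtrain[:, d], lengthscales[d]) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(100, 2))
# Ground truth is itself additive: f(x) = sin(x1) + x2^2
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(100)
mu = gp_posterior_mean(X, y, X, lengthscales=[1.0, 1.0])
f1 = component_mean(X, y, X, 0, lengthscales=[1.0, 1.0])
```

Replacing the full kernel matrix with inducing-point (sparse variational) approximations, as in the paper, reduces the cubic cost of the `np.linalg.solve` call while keeping this additive decomposition.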


Related research:

- 05/24/2017: Doubly Stochastic Variational Inference for Deep Gaussian Processes
  Gaussian processes (GPs) are a good choice for function approximation as...
- 04/18/2016: Chained Gaussian Processes
  Gaussian process models are flexible, Bayesian non-parametric approaches...
- 10/29/2018: Variational Calibration of Computer Models
  Bayesian calibration of black-box computer models offers an established ...
- 02/01/2023: Short-term Prediction and Filtering of Solar Power Using State-Space Gaussian Processes
  Short-term forecasting of solar photovoltaic energy (PV) production is i...
- 03/21/2018: Scalable Generalized Dynamic Topic Models
  Dynamic topic models (DTMs) model the evolution of prevalent themes in l...
- 04/29/2023: Representing Additive Gaussian Processes by Sparse Matrices
  Among generalized additive models, additive Matérn Gaussian Processes (G...
- 06/02/2021: Connections and Equivalences between the Nyström Method and Sparse Variational Gaussian Processes
  We investigate the connections between sparse approximation methods for ...
