How Good are Low-Rank Approximations in Gaussian Process Regression?

Abstract

We provide guarantees for approximate Gaussian process regression resulting from two common low-rank kernel approximations: random Fourier features and truncation of the kernel's Mercer expansion. In particular, we bound the Kullback-Leibler divergence between an exact Gaussian process and one resulting from either of these low-rank approximations to its kernel, as well as between their corresponding predictive densities. We also present experiments on both simulated data and standard benchmarks showing the effectiveness of our theoretical bounds.
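
For context on the quantity being bounded: when both processes are evaluated at the n training inputs, the KL divergence is between two n-dimensional Gaussians, for which the standard closed form is KL(N(0, K̃) ‖ N(0, K)) = ½(tr(K⁻¹K̃) − n + log det K − log det K̃); this is a textbook identity, not the paper's bound itself. To make the first approximation concrete, the sketch below is a minimal NumPy illustration of random Fourier features (Rahimi & Recht, 2007), not code from the paper; the function names and the choice of an RBF kernel are our own assumptions. It builds a D-dimensional feature map whose inner products approximate the exact kernel, yielding a rank-at-most-D kernel matrix.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    """Exact squared-exponential (RBF) kernel matrix."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def random_fourier_features(X, num_features, lengthscale=1.0, seed=None):
    """Map inputs to D random Fourier features so that
    phi(X) @ phi(Y).T approximates the RBF kernel."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # The spectral density of the RBF kernel is Gaussian with scale 1/lengthscale,
    # so frequencies are sampled from N(0, lengthscale^{-2} I).
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
phi = random_fourier_features(X, num_features=2000, seed=1)
K_exact = rbf_kernel(X, X)
K_approx = phi @ phi.T  # low-rank (rank <= 2000) kernel approximation
# The entrywise error shrinks as num_features grows.
print(np.abs(K_exact - K_approx).max())
```

Plugging K_approx in place of K_exact in the usual GP posterior equations gives the approximate GP whose divergence from the exact one results such as those in the paper control.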

Publication
In Proceedings of the 36th AAAI Conference on Artificial Intelligence (AAAI-22)
Aristeidis Panos