PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Gaussian processes and fast matrix-vector multiplies
Iain Murray
In: Numerical Mathematics in Machine Learning workshop at the 26th International Conference on Machine Learning (ICML 2009), 18 June 2009, Montreal, Canada.


Gaussian processes (GPs) provide a flexible framework for probabilistic regression. The necessary computations involve standard matrix operations. There have been several attempts to accelerate these operations based on fast kernel matrix-vector multiplications. By focussing on the simplest GP computation, corresponding to test-time predictions in kernel ridge regression, we conclude that simple approximations based on clusterings in a kd-tree can never work well for simple regression problems. Analytical expansions can provide speedups, but current implementations are limited to the squared-exponential kernel and low-dimensional problems. We discuss future directions.
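The "simplest GP computation" referred to above is the posterior mean at test points, which in kernel ridge regression reduces to a kernel matrix-vector multiply: f* = K*· alpha, with alpha = (K + sigma²I)⁻¹ y precomputed at training time. A minimal numpy sketch under illustrative assumptions (squared-exponential kernel, synthetic data, a hypothetical `rbf_kernel` helper) shows where the fast-MVM methods discussed here would be applied:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential (RBF) kernel matrix between rows of A and B.

    Hypothetical helper for illustration; real GP code would use a
    library kernel with learned hyperparameters.
    """
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))    # training inputs (N=50, D=2)
y = np.sin(X[:, 0])                 # synthetic training targets
X_star = rng.standard_normal((10, 2))  # test inputs

# Training-time cost: one O(N^3) linear solve for the weight vector alpha.
noise_var = 1e-3
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + noise_var * np.eye(len(X)), y)

# Test-time cost: the prediction is a kernel matrix-vector multiply,
# K(X_star, X) @ alpha -- exactly the operation that tree-based and
# analytical-expansion methods attempt to accelerate.
f_star = rbf_kernel(X_star, X) @ alpha
```

The fast-MVM approximations the abstract evaluates replace the final dense `rbf_kernel(X_star, X) @ alpha` product with an approximate multiply whose cost is sub-quadratic in the number of points.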

PDF (Extended abstract to accompany talk.)
EPrint Type: Conference or Workshop Item (Talk)
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 5942
Deposited By: Iain Murray
Deposited On: 8 March 2010