PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Kick-starting GPLVM Optimization via a Connection to Metric MDS
Sebastian Bitzer and Christopher Williams
In: NIPS 2010 Workshop on Challenges of Data Visualization, 11 Dec 2010, Whistler, British Columbia, Canada.


The Gaussian Process Latent Variable Model (GPLVM) [1] is an attractive model for dimensionality reduction, but the optimization of the GPLVM likelihood with respect to the latent point locations is difficult, and prone to local optima. Here we start from the insight that in the GPLVM we should have k(x_i, x_j) \simeq s_{ij}, where k(x_i, x_j) is the kernel function evaluated at latent points x_i and x_j, and s_{ij} is the corresponding estimate from the data. For an isotropic covariance function this relationship can be inverted to yield an estimate of the interpoint distances {d_{ij}} in the latent space, and these can be fed into a multidimensional scaling (MDS) algorithm. This yields an initial estimate of the latent locations, which can be subsequently optimized in the usual GPLVM fashion. We compare two variants of this approach to the standard PCA initialization and to the ISOMAP algorithm [2], and show that our initialization converges to the best GPLVM likelihoods on all six tested motion capture data sets.
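The inversion-plus-MDS idea above can be sketched in a few lines of NumPy. The snippet below is a minimal illustration, not the authors' implementation: it assumes an RBF (squared-exponential) covariance k(x_i, x_j) = sigma_f^2 exp(-d_{ij}^2 / (2 l^2)), inverts it to recover squared latent distances d_{ij}^2 = -2 l^2 log(s_{ij} / sigma_f^2), and runs classical (metric) MDS on the result. The function name and the choice of similarity estimate are hypothetical; the paper's exact estimator for s_{ij} may differ.

```python
import numpy as np

def kernel_mds_init(S, q=2, sigma_f=1.0, lengthscale=1.0):
    """Initialize GPLVM latent locations from a similarity matrix S.

    Inverts an assumed RBF kernel to squared distances, then applies
    classical MDS. S is an (n, n) matrix of similarity estimates s_ij.
    Returns an (n, q) array of initial latent points.
    """
    n = S.shape[0]
    # Invert k = sigma_f^2 * exp(-d^2 / (2 l^2))  =>  d^2 = -2 l^2 log(k / sigma_f^2)
    ratio = np.clip(S / sigma_f**2, 1e-12, 1.0)  # keep the log well-defined
    D2 = -2.0 * lengthscale**2 * np.log(ratio)
    np.fill_diagonal(D2, 0.0)
    # Classical MDS: double-center the squared distances and eigendecompose
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J
    w, V = np.linalg.eigh(B)          # ascending eigenvalues
    idx = np.argsort(w)[::-1][:q]     # take the top-q components
    scale = np.sqrt(np.clip(w[idx], 0.0, None))
    return V[:, idx] * scale

# Usage: with an exact RBF similarity matrix (unit lengthscale), the
# initialization recovers the configuration up to rotation/translation.
rng = np.random.default_rng(0)
X_true = rng.normal(size=(30, 2))
D2_true = ((X_true[:, None] - X_true[None]) ** 2).sum(-1)
S = np.exp(-D2_true / 2.0)
X0 = kernel_mds_init(S, q=2)
```

In practice the recovered X0 would then be passed to a standard GPLVM optimizer as the starting latent configuration, in place of the usual PCA initialization.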

EPrint Type: Conference or Workshop Item (Paper)
Subjects: Learning/Statistics & Optimisation
ID Code: 7651
Deposited By: Christopher Williams
Deposited On: 17 March 2011