PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Efficient inference in matrix-variate Gaussian models with iid observation noise
Oliver Stegle, Christoph Lippert, Joris Mooij, Neil Lawrence and Karsten Borgwardt
In: Advances in Neural Information Processing Systems 24 (NIPS*2011), December 12-14, 2011, Granada, Spain.

Abstract

Inference in matrix-variate Gaussian models has major applications in multi-output prediction and in the joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for iid observation noise. Computational tractability can be retained by exploiting the Kronecker product structure of the row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We demonstrate practical utility in applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are better able to reconstruct the confounders.
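
The tractability claim rests on the Kronecker structure of the combined covariance C (x) R + sigma^2 I: eigendecomposing the row covariance R and the column covariance C separately reduces the cost of inference from cubic in n*p to cubic in n and p individually, even with the added iid noise term. The Python sketch below is not the authors' released code; the function name kron_gaussian_inference and all variable names are illustrative, and it covers only the marginal-likelihood evaluation that this structure makes cheap, under the assumption that vec(Y) has covariance C (x) R + sigma^2 I.

import numpy as np

def kron_gaussian_inference(Y, R, C, sigma2):
    """Log marginal likelihood and alpha = K^{-1} vec(Y) for a
    matrix-variate Gaussian with K = C (x) R + sigma2 * I, where
    Y is n x p, R is the n x n row (sample) covariance and C is the
    p x p column (feature) covariance.  Cost O(n^3 + p^3) instead of
    O((n p)^3)."""
    n, p = Y.shape
    # Eigendecompose the two small covariance factors.
    s_r, U_r = np.linalg.eigh(R)            # R = U_r diag(s_r) U_r^T
    s_c, U_c = np.linalg.eigh(C)            # C = U_c diag(s_c) U_c^T
    # Rotate the data into the joint eigenbasis:
    # (U_c (x) U_r)^T vec(Y) = vec(U_r^T Y U_c).
    Y_rot = U_r.T @ Y @ U_c
    # Eigenvalues of C (x) R + sigma2*I: outer product of factor eigenvalues plus noise.
    S = np.outer(s_r, s_c) + sigma2
    # alpha = K^{-1} vec(Y), rotated back to the original basis.
    alpha = U_r @ (Y_rot / S) @ U_c.T
    # Gaussian log marginal likelihood of vec(Y).
    log_det = np.sum(np.log(S))
    quad = np.sum(Y_rot * (Y_rot / S))
    log_lik = -0.5 * (n * p * np.log(2.0 * np.pi) + log_det + quad)
    return log_lik, alpha

if __name__ == "__main__":
    # Toy check against the naive dense computation on the full n*p covariance.
    rng = np.random.default_rng(0)
    n, p = 5, 4
    A = rng.standard_normal((n, n)); R = A @ A.T + np.eye(n)
    B = rng.standard_normal((p, p)); C = B @ B.T + np.eye(p)
    Y = rng.standard_normal((n, p))
    log_lik, alpha = kron_gaussian_inference(Y, R, C, 0.1)
    K = np.kron(C, R) + 0.1 * np.eye(n * p)
    y = Y.flatten(order="F")                # column-stacked vec(Y)
    naive = -0.5 * (n * p * np.log(2.0 * np.pi)
                    + np.linalg.slogdet(K)[1]
                    + y @ np.linalg.solve(K, y))
    assert np.allclose(log_lik, naive)

The full method described in the abstract additionally learns a sparse inverse covariance between features (the Graphical Lasso generalization) and a low-rank confounding covariance between samples; the sketch illustrates only the likelihood computation that the Kronecker identity keeps tractable.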

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Theory & Algorithms
ID Code: 8964
Deposited By: Joris Mooij
Deposited On: 21 February 2012