PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Linear Programming, Belief Propagation and Low Level Vision
Yair Weiss
In: Emphasis Week on Learning and Inference in Low and Mid Level Vision, 23 - 25 February 2005, Berkeley, CA.


Linear Programming (LP) relaxations are a standard method in computer science for approximately solving combinatorial optimization problems. I will briefly describe our failed attempts to apply LP relaxations to problems in low-level vision: the number of variables and constraints is simply too large for general-purpose LP solvers. I will then describe a surprising recent result by Wainwright, Jaakkola and Willsky (2003) that shows how to solve LP relaxations using a variant of Belief Propagation. Our theoretical contribution is an extension of the Wainwright et al. result to relaxations whose solutions are partly integer and partly fractional. Experimentally, we use our results to find the global optimum of a widely used energy function for stereo vision. On a standard benchmark set to which many approximation algorithms have been applied, we are able to find the global optimum in a few minutes. Joint work with Talya Meltzer and Chen Yanover.
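To make the LP-relaxation idea concrete, here is a minimal sketch (not from the talk itself) of relaxing a classic combinatorial problem, minimum vertex cover on a triangle, and solving it with an off-the-shelf LP solver. The relaxation replaces the integrality constraint x_v in {0, 1} with 0 <= x_v <= 1; on this graph the integer optimum is 2, but the LP returns the entirely fractional solution x = (0.5, 0.5, 0.5) with value 1.5, illustrating the kind of partly or wholly fractional solutions the abstract refers to. The graph and solver choice are illustrative assumptions.

```python
# Illustrative sketch: LP relaxation of minimum vertex cover on a triangle.
# Minimize sum_v x_v  subject to  x_u + x_v >= 1 for each edge (u, v),
# with the binary constraint x_v in {0, 1} relaxed to 0 <= x_v <= 1.
import numpy as np
from scipy.optimize import linprog

edges = [(0, 1), (1, 2), (0, 2)]  # triangle graph (assumed example)
n = 3
c = np.ones(n)  # objective: minimize the total weight of the cover

# linprog expects A_ub @ x <= b_ub, so rewrite x_u + x_v >= 1
# as -x_u - x_v <= -1, one row per edge.
A_ub = np.zeros((len(edges), n))
for i, (u, v) in enumerate(edges):
    A_ub[i, u] = A_ub[i, v] = -1.0
b_ub = -np.ones(len(edges))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)
print(res.x, res.fun)  # -> approximately [0.5 0.5 0.5], 1.5
```

Summing the three edge constraints gives 2(x_0 + x_1 + x_2) >= 3, so the LP value 1.5 is a certified lower bound on the integer optimum of 2; the gap between the two is exactly what rounding schemes and tighter relaxations try to close.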

EPrint Type: Conference or Workshop Item (Invited Talk)
Project Keyword: UNSPECIFIED
Subjects: Machine Vision
Learning/Statistics & Optimisation
ID Code: 1067
Deposited By: Yair Weiss
Deposited On: 04 September 2005