Linear Programming, Belief Propagation and Low Level Vision
Linear Programming (LP) relaxations are a standard method in computer science for approximately solving combinatorial optimization problems. I will briefly describe our failed attempts to apply LP relaxations to problems in low-level vision: the number of variables and constraints is simply too large for general-purpose LP solvers. I will then describe a surprising recent result by Wainwright, Jaakkola and Willsky (2003) that shows how to solve LP relaxations using a variant of Belief Propagation. Our theoretical contribution is an extension of the Wainwright et al. result to relaxations whose solutions are partly integer and partly fractional. Experimentally, we use our results to find the global optimum of a widely used energy function for stereo vision. On a standard benchmark set to which many approximation algorithms have been applied, we are able to find the global optimum in a few minutes. Joint work with Talya Meltzer and Chen Yanover.
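For context, the kind of LP relaxation discussed here can be sketched as follows. For a pairwise energy over discrete labels, MAP inference is relaxed by introducing marginal variables \(\mu\) and dropping integrality; the notation below is generic textbook notation, not taken from the talk itself:

```latex
\max_{\mu \ge 0}\;
\sum_{i}\sum_{x_i}\theta_i(x_i)\,\mu_i(x_i)
\;+\;
\sum_{(i,j)}\sum_{x_i,x_j}\theta_{ij}(x_i,x_j)\,\mu_{ij}(x_i,x_j)
\quad\text{s.t.}\quad
\sum_{x_j}\mu_{ij}(x_i,x_j)=\mu_i(x_i),
\qquad
\sum_{x_i}\mu_i(x_i)=1 .
```

If the optimal \(\mu\) is integral, it is the exact MAP solution; the partly integer, partly fractional case is the one addressed by the extension described above.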