Improved Initialisation and Gaussian Mixture Pairwise Terms for Dense Random Fields with Mean-field Inference
Vibhav Vineet, Jonathan Warrell, Paul Sturgess and Philip Torr
In: BMVC 2012, 3-7 September 2012, Surrey, UK.
Recently, Krahenbuhl and Koltun proposed an efficient inference method for densely connected pairwise random fields, using a mean-field approximation to the Conditional Random Field (CRF). However, they restrict their pairwise weights to a weighted combination of Gaussian kernels, where each Gaussian component must have zero mean and can only be rescaled by a single value for each label pair. Further, their method is sensitive to initialisation. In this paper, we propose methods to alleviate these issues. First, we propose a hierarchical mean-field approach in which the labelling from the coarser level is propagated to the finer level for better initialisation; in addition, we use SIFT-flow based label transfer to provide a good initial condition at the coarsest level. Second, we allow our approach to take general Gaussian pairwise weights, learning the mean, the covariance matrix, and the mixing coefficient of every mixture component. We propose a variant of Expectation Maximization (EM) for piecewise learning of the mixture-model parameters by maximum likelihood. Finally, we demonstrate the efficiency and accuracy of our method on object class segmentation problems on two challenging datasets: PascalVOC-10 and CamVid. We achieve state-of-the-art performance on the CamVid dataset and an almost 3% improvement on the PascalVOC-10 dataset compared to baseline graph-cut and mean-field methods, while also reducing inference time by almost a factor of 3 compared to graph-cut based methods.
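To make the core idea concrete, the following is a minimal, illustrative sketch of mean-field inference on a fully connected CRF whose pairwise kernel is a mixture of general Gaussians (nonzero means and full covariances, as the abstract describes), combined with a Potts label compatibility. All function and variable names here are our own; the naive O(N²) kernel computation stands in for the efficient high-dimensional filtering the actual method relies on, so it is only practical for toy problem sizes.

```python
import numpy as np

def softmax(x):
    """Row-wise softmax, numerically stabilised."""
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def mean_field_dense_crf(unary, feats, mixture, n_iters=5):
    """Naive mean-field for a fully connected CRF (illustrative sketch).

    unary   : (N, L) negative log unary potentials
    feats   : (N, D) per-pixel features (e.g. position + colour)
    mixture : list of (weight, mean, cov) components; a nonzero mean
              shifts the feature difference before the Gaussian,
              generalising the zero-mean kernels of Krahenbuhl & Koltun.
    """
    N, L = unary.shape
    # Precompute the N x N kernel as a sum of general Gaussians over
    # feature differences f_i - f_j. Quadratic in N: fine for a toy
    # example, but the real method uses efficient filtering instead.
    K = np.zeros((N, N))
    for w, mu, cov in mixture:
        diff = feats[:, None, :] - feats[None, :, :] - mu  # (N, N, D)
        inv = np.linalg.inv(cov)
        maha = np.einsum('ijd,de,ije->ij', diff, inv, diff)
        K += w * np.exp(-0.5 * maha)
    np.fill_diagonal(K, 0.0)  # no self-interaction

    Q = softmax(-unary)  # initialise marginals from the unaries
    for _ in range(n_iters):
        msg = K @ Q  # aggregate neighbour beliefs per label
        # Potts compatibility: agreeing with similar pixels lowers
        # the energy, so the message enters with a positive sign.
        Q = softmax(-unary + msg)
    return Q
```

The paper's hierarchical scheme would replace the `softmax(-unary)` initialisation with marginals propagated from a coarser level (seeded by SIFT-flow label transfer at the coarsest one), and the `(weight, mean, cov)` triples would come from the proposed piecewise EM rather than being fixed by hand.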
EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: Project Keyword UNSPECIFIED
Deposited By: Sunando Sengupta
Deposited On: 19 October 2012