## Abstract

We present a general and efficient optimization methodology for max-margin structured classification tasks. The efficiency of the method relies on the interplay of several techniques: formulation of the structured support vector machine or max-margin Markov problem as an optimization problem; marginalization of the dual of the optimization; partial decomposition via a gradient formulation; and finally tight coupling of a maximum likelihood inference algorithm into the optimization algorithm, as opposed to using inference as a working set maintenance mechanism only. The tight coupling also allows fast approximate inference to be used effectively in the learning. The generality of the method follows from the fact that changing the output structure in essence only changes the inference algorithm, that is, the method can to a large extent be used in a 'plug and play' fashion.
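For reference, a common starting point for the formulation mentioned above is the margin-rescaled structured SVM primal; this is a standard textbook form and not necessarily the exact variant developed in the paper:

```latex
% Standard margin-rescaled structured SVM primal (generic sketch):
% w      -- weight vector
% \phi   -- joint input-output feature map
% \ell   -- structured loss between the true output y_i and a candidate y
% C      -- regularization trade-off; \xi_i -- slack variables
\min_{\mathbf{w},\,\boldsymbol{\xi}\ge 0}\;
  \tfrac{1}{2}\|\mathbf{w}\|^2 + C\sum_{i=1}^{m}\xi_i
\quad\text{s.t.}\quad
  \mathbf{w}^{\top}\bigl(\phi(x_i,y_i)-\phi(x_i,y)\bigr)
  \;\ge\; \ell(y_i,y)-\xi_i
  \qquad \forall i,\;\forall y\neq y_i .
```

The exponential number of constraints (one per candidate output `y`) is what motivates working in a marginalized dual and coupling inference tightly into the optimization, as the abstract describes.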