PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Boosting Algorithms for Maximizing the Soft Margin
Manfred Warmuth, Karen Glocer and Gunnar Raetsch
In: NIPS 2007, 4-8 Dec 2007.


We present a novel boosting algorithm, called SoftBoost, designed for sets of binary labeled examples that are not necessarily separable by convex combinations of base hypotheses. Our algorithm achieves robustness by capping the distributions on the examples. Our update of the distribution is motivated by minimizing a relative entropy subject to the capping constraints and constraints on the edges of the obtained base hypotheses. The capping constraints imply a soft margin in the dual optimization problem, and our algorithm produces a convex combination of hypotheses whose soft margin is within delta of the optimum. We employ relative entropy projection methods to prove an O(ln N/delta^2) iteration bound for our algorithm, where N is the number of examples. We compare our algorithm with other approaches, including LPBoost and SmoothBoost. We show that there exist cases where the number of iterations required by LPBoost grows linearly in N, instead of the logarithmic growth for SoftBoost. In simulation studies we show that our algorithm converges much faster than SmoothBoost and about as fast as LPBoost. In a benchmark comparison we illustrate the competitiveness of our approach.
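The capping step described in the abstract can be illustrated by the relative-entropy projection onto the capped probability simplex {d : sum d_i = 1, 0 <= d_i <= cap}. This is a minimal sketch under my own assumptions (function name and structure are illustrative, not the authors' implementation); it uses the standard fact that the KL projection has the form d_i = min(cap, c * w_i) for a single scaling constant c:

```python
def entropy_project_capped(w, cap):
    """KL projection of positive weights w onto {d : sum d = 1, 0 <= d_i <= cap}.

    Illustrative sketch (not the paper's code): the minimizer of the relative
    entropy sum d_i * ln(d_i / w_i) over the capped simplex has the form
    d_i = min(cap, c * w_i), so we find how many of the largest weights
    saturate the cap and scale the remaining weights to fill the rest.
    """
    N = len(w)
    assert cap * N >= 1.0, "cap too small: capped simplex is empty"
    order = sorted(range(N), key=lambda i: w[i], reverse=True)
    num_capped, tail_mass = 0, sum(w)
    # Greedily cap the largest weights until scaling the rest stays below the cap.
    for i in order:
        c = (1.0 - num_capped * cap) / tail_mass
        if c * w[i] <= cap:
            break
        num_capped += 1
        tail_mass -= w[i]
    c = (1.0 - num_capped * cap) / tail_mass
    return [min(cap, c * w[i]) for i in range(N)]

# Example: one dominant weight gets clipped to the cap, the rest are rescaled.
d = entropy_project_capped([4.0, 1.0, 1.0, 1.0, 1.0], cap=0.3)
# d sums to 1 and no example carries more than 0.3 of the total weight
```

Capping at 1/(nu*N) for a fraction nu of the examples is what yields the nu-soft-margin in the dual; the full SoftBoost update additionally imposes edge constraints on the past hypotheses, which requires a more general convex program than this one-constraint projection.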

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: Project Keyword UNSPECIFIED
Subjects: Learning/Statistics & Optimisation
          Theory & Algorithms
ID Code: 3139
Deposited By: Gunnar Raetsch
Deposited On: 21 December 2007