PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Fast Rates for Regularized Objectives
Karthik Sridharan, Shai Shalev-Shwartz and Nathan Srebro
In: NIPS 2008, Dec 2008, Vancouver.

Abstract

We show that the empirical minimizer of a stochastic strongly convex objective, where the stochastic component is linear, converges to the population minimizer with rate $O(1/n)$. The result applies, in particular, to the SVM objective. Thus, we obtain a rate of $O(1/n)$ on the convergence of the SVM objective to its infinite data limit. We demonstrate how this is essential for obtaining tight oracle inequalities for SVMs. The results also extend to strong convexity with respect to other norms, and hence to objectives regularized using other $\ell_p$ norms.
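As a concrete illustration of the kind of objective the abstract refers to, the following is a minimal sketch (not taken from the paper) of the $\lambda$-strongly convex, $\ell_2$-regularized SVM objective: average hinge loss plus $(\lambda/2)\|w\|^2$. The function and variable names (svm_objective, lam, w) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def svm_objective(w, X, y, lam):
    """Empirical regularized SVM objective:
    F_n(w) = (1/n) * sum_i max(0, 1 - y_i <w, x_i>) + (lam/2) * ||w||^2.
    The hinge-loss term is the linear stochastic component; the
    (lam/2)||w||^2 regularizer makes the objective lam-strongly convex in w."""
    margins = y * (X @ w)
    hinge = np.maximum(0.0, 1.0 - margins).mean()
    return hinge + 0.5 * lam * np.dot(w, w)

# Toy usage: evaluate the objective on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sign(rng.normal(size=100))
w = np.zeros(5)
print(svm_objective(w, X, y, lam=0.1))  # equals 1.0 at w = 0
```

The paper's result concerns how fast the minimizer of this empirical objective approaches the minimizer of its expectation (the infinite-data limit) as the sample size $n$ grows.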

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Theory & Algorithms
ID Code: 5423
Deposited By: Shai Shalev-Shwartz
Deposited On: 02 July 2009