PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent
Antoine Bordes, Léon Bottou and Patrick Gallinari
Journal of Machine Learning Research Volume 10, pp. 1737-1754, 2009. ISSN 1533-7928


The SGD-QN algorithm is a stochastic gradient descent algorithm that makes careful use of second-order information and splits the parameter update into independently scheduled components. Thanks to this design, SGD-QN iterates nearly as fast as a first-order stochastic gradient descent but requires fewer iterations to achieve the same accuracy. This algorithm won the “Wild Track” of the first PASCAL Large Scale Learning Challenge (Sonnenburg et al., 2008).
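The two-schedule design described above can be illustrated with a short sketch: the weight vector is updated at every example, while a diagonal scaling matrix is refreshed only every `skip` iterations from a secant (gradient-difference) estimate of the curvature. This is a minimal illustrative sketch for a linear SVM with hinge loss, not the authors' reference implementation; the parameter names (`lam`, `skip`, `t0`) and the clamping details are assumptions made for the example.

```python
import numpy as np

def hinge_grad(w, x, yi, lam):
    # Subgradient of lam/2*||w||^2 + max(0, 1 - yi * <w, x>) on one example.
    g = lam * w
    if yi * x.dot(w) < 1:
        g = g - yi * x
    return g

def sgd_qn_sketch(X, y, lam=1e-4, epochs=5, skip=16):
    """Sketch of an SGD-QN-style update (illustrative, not the paper's code).

    The weights w are updated every iteration; the diagonal scaling B is
    updated only every `skip` iterations, from a secant estimate of the
    curvature on a single example -- the "independently scheduled" component.
    """
    n, d = X.shape
    w = np.zeros(d)
    B = np.full(d, 1.0 / lam)     # diagonal scaling, bounded above by 1/lam
    t0 = 1.0 / lam                # assumed schedule offset
    t, count = 0, skip
    pending = None                # (example index, old weights) for the secant
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        for i in rng.permutation(n):
            if pending is not None:
                # Secant estimate: same example, gradients at old and new w.
                j, w_old = pending
                dw = w - w_old
                dg = hinge_grad(w, X[j], y[j], lam) - hinge_grad(w_old, X[j], y[j], lam)
                ratio = np.where(np.abs(dw) > 1e-12, dg / np.where(np.abs(dw) > 1e-12, dw, 1.0), lam)
                ratio = np.maximum(ratio, lam)       # curvature at least lam
                B += (2.0 / skip) * (1.0 / ratio - B)  # slow tracking update
                pending = None
            g = hinge_grad(w, X[i], y[i], lam)
            count -= 1
            if count <= 0:
                pending = (i, w.copy())   # schedule a scaling update next step
                count = skip
            w -= (1.0 / (t + t0)) * B * g  # scaled first-order step
            t += 1
    return w
```

Because the scaling update runs only once every `skip` examples, the per-iteration cost stays close to that of plain first-order SGD, which is the point of the design.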

EPrint Type: Article
Subjects: Theory & Algorithms
ID Code: 6505
Deposited By: Antoine Bordes
Deposited On: 08 March 2010