PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

A Universal Machine Learning Optimization Framework For Arbitrary Outputs
Sandor Szedmak and Zakria Hussain
(2009) Working Paper. Sandor Szedmak, Southampton, UK.

Abstract

We derive a general Convex Linearly Constrained Program (CLCP) parameterized by a matrix $G$, constructed from the information given by the input-output pairs. The CLCP is then specialised by choosing a set of regularization and loss functions that impose the constraints for the learning task. We show that several algorithms, including the SVM, LPBoost and Ridge Regression, can be solved within the same optimization framework when the appropriate choices of $G$, regularization and loss function are made. Due to this unification, we show that if $G$ is constructed from more complex paired input-output information, then more difficult problems such as structured output learning can be solved with the same complexity as a regression or classification problem. We discuss several different forms of $G$ and then show, on real-world enzyme prediction tasks requiring structured outputs, that our method performs as well as the state of the art.
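
As an illustrative sketch only (this is not the paper's CLCP formulation; the RBF kernel, the cvxpy solver and the parameters C and lam are assumptions made for the example), the snippet below conveys the flavour of the unification claimed in the abstract: the SVM dual and kernel ridge regression minimise the same quadratic form 0.5 u'Gu - q'u over a matrix G assembled from the input-output pairs, differing only in the linear term q and the linear constraints on u.

```python
# Minimal sketch: two classical learners as one quadratic objective over G.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y_cls = np.sign(X[:, 0] + 0.3 * rng.normal(size=40))           # labels in {-1, +1}
y_reg = X @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=40)  # real-valued targets

# RBF Gram matrix on the inputs, with a small jitter for numerical PSD-ness.
K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
K += 1e-8 * np.eye(len(X))

def solve_qp(G, q, constraints_fn):
    """Minimise 0.5*u'Gu - q'u subject to the linear constraints on u."""
    u = cp.Variable(len(q))
    objective = cp.Minimize(0.5 * cp.quad_form(u, G) - q @ u)
    cp.Problem(objective, constraints_fn(u)).solve()
    return u.value

# SVM dual: G_ij = y_i * y_j * K_ij couples inputs and outputs, q = 1,
# box constraints 0 <= u <= C and the bias constraint y'u = 0.
C = 1.0
alpha = solve_qp(np.outer(y_cls, y_cls) * K, np.ones(len(y_cls)),
                 lambda u: [u >= 0, u <= C, y_cls @ u == 0])

# Kernel ridge regression: G = K + lam*I, q = y, no constraints; the
# unconstrained minimiser agrees with the closed form (K + lam*I)^{-1} y.
lam = 0.1
beta = solve_qp(K + lam * np.eye(len(y_reg)), y_reg, lambda u: [])
beta_closed = np.linalg.solve(K + lam * np.eye(len(y_reg)), y_reg)
print("max deviation from closed-form ridge solution:",
      np.max(np.abs(beta - beta_closed)))
```

In the same spirit, the abstract's point about structured outputs amounts to changing only how G is assembled from joint input-output information, while the optimization problem itself keeps the complexity of a regression or classification task.
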

EPrint Type: Monograph (Working Paper)
Subjects: Learning/Statistics & Optimisation
ID Code: 5502
Deposited By: Sandor Szedmak
Deposited On: 11 December 2009