Parametric polynomial time perceptron rescaling algorithm
Algorithms and Complexity in Durham 2006: Proceedings of the Second ACiD Workshop
Texts in Algorithmics
College Publications, King's College London, UK
Let us consider a linear feasibility problem with a possibly infinite number of inequality constraints, posed in an on-line setting: an algorithm suggests a candidate solution, and an oracle either confirms its feasibility or outputs a violated constraint vector. This model can be solved by subgradient optimisation algorithms for non-smooth functions, known in the machine learning community as perceptron algorithms, and its solvability depends on the problem dimension and the radius of the constraint set.
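The on-line model above can be sketched as follows; this is a minimal illustration of the classical perceptron feasibility loop, not the algorithm analysed in the paper. The names `oracle` and `perceptron_feasibility`, and the toy constraint matrix, are illustrative assumptions.

```python
import numpy as np

def perceptron_feasibility(oracle, dim, max_iters=10_000):
    """Classical perceptron loop for linear feasibility: seek x with
    a . x > 0 for every constraint vector a.  `oracle(x)` returns None
    when x is feasible, otherwise some violated constraint a."""
    x = np.zeros(dim)
    for _ in range(max_iters):
        a = oracle(x)
        if a is None:
            return x                       # feasible point found
        x = x + a / np.linalg.norm(a)      # subgradient step towards a
    return None                            # iteration budget exhausted

# Toy instance: constraints are the rows of A (feasible, positive radius).
A = np.array([[1.0, 0.2], [0.5, 1.0], [0.9, -0.1]])

def oracle(x):
    for a in A:
        if a @ x <= 0:
            return a
    return None
```

The number of update steps is governed by the radius of the constraint set: the smaller the radius, the more corrections the loop may need before the oracle accepts.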
The classical perceptron algorithm may require exponentially many iterations in the worst case when this radius is infinitesimally small. To overcome this difficulty, the space dilation technique was exploited in the ellipsoid algorithm to make its running time polynomial. A special case of space dilation, the rescaling procedure, is used in the perceptron rescaling algorithm, which chooses the direction of dilation probabilistically.
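A single rescaling step can be sketched as a linear map applied to the constraint vectors; this is a hedged illustration of the dilation idea (in the spirit of the perceptron rescaling algorithm), and the function name `rescale` is an assumption.

```python
import numpy as np

def rescale(A, direction):
    """One rescaling (space dilation) step: dilate along the unit
    vector x_hat by applying  I + x_hat x_hat^T  to every constraint
    row, so each row a becomes  a + (a . x_hat) x_hat.  When x_hat is
    nearly feasible, this stretches the constraint set along x_hat."""
    x_hat = direction / np.linalg.norm(direction)
    T = np.eye(len(x_hat)) + np.outer(x_hat, x_hat)
    return A @ T
```

The direction of dilation is chosen probabilistically from the current perceptron iterate; a successful choice enlarges the radius of the transformed problem, which is what yields the polynomial running time.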
A parametric version of the perceptron rescaling algorithm is the focus of this work. It is demonstrated that certain fixed parameters of that algorithm (the initial estimate of the radius and the relaxation parameter) may be modified and adapted to particular problems. The generalised theoretical framework makes it possible to establish convergence of the algorithm for any chosen set of values of these parameters, and suggests a potential way of decreasing its complexity, which remains the subject of current research.