PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Multiplicative updates for nonnegative projections
Zhirong Yang and Jorma Laaksonen
Neurocomputing Volume 71, Number 1-3, pp. 363-373, 2007.

Abstract

We show how to construct multiplicative update rules for non-negative projections based on Oja's iterative learning rule. Our method integrates the multiplicative normalization factor into the original additive update rule as an additional term which generally points in a roughly opposite direction. As a consequence, the modified additive learning rule can easily be converted to its multiplicative version, which maintains non-negativity after each iteration. The derivation of our approach provides a sound interpretation of learning non-negative projection matrices based on iterative multiplicative updates: a kind of Hebbian learning with normalization. A convergence analysis is sketched by interpreting the multiplicative updates as a special case of natural gradient learning. We also demonstrate two application examples of the proposed technique, a non-negative variant of the linear Hebbian networks and a non-negative Fisher discriminant analysis, including its kernel extension. The resulting example algorithms demonstrate interesting properties for data analysis tasks in experiments performed on facial images.
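The central idea in the abstract, rewriting an additive gradient step as a multiplicative one so that entrywise non-negativity is preserved automatically, follows a general recipe: split the gradient into a positive and a negative part, and multiply each parameter by the ratio of the negative part to the positive part. The sketch below illustrates that recipe with the well-known multiplicative updates for non-negative matrix factorization (Lee and Seung); it is only an illustration of the principle under our own assumptions, not the paper's Oja-based algorithm, and the function name is ours.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Illustrative multiplicative updates for V ~ W @ H with W, H >= 0.

    The gradient of ||V - W H||^2 w.r.t. H splits into a positive part
    (W^T W H) and a negative part (W^T V); multiplying H entrywise by
    the ratio (negative part)/(positive part) keeps H non-negative,
    and the fixed points satisfy the gradient condition.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1   # strictly positive initialization
    H = rng.random((rank, m)) + 0.1
    losses = []
    for _ in range(n_iter):
        # Update H: multiply by negative gradient part over positive part.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # Same recipe for W.
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        losses.append(float(np.linalg.norm(V - W @ H) ** 2))
    return W, H, losses

# Usage: factorize a small non-negative matrix; the reconstruction
# error decreases while both factors stay entrywise non-negative.
V = np.abs(np.random.default_rng(1).normal(size=(20, 15)))
W, H, losses = nmf_multiplicative(V, rank=4)
```

The same splitting trick is what makes a multiplicative version of an additive rule possible in general: no step size is needed, and non-negativity of the iterates is preserved by construction rather than by projection.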

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics
  Learning/Statistics & Optimisation
  Theory & Algorithms
ID Code: 3632
Deposited By: Jorma Laaksonen
Deposited On: 14 February 2008